Hello,
I am trying to recreate a data type in Java used by Adobe: an unsigned integer 30 bits long, or u30 in the specification. Anyway, I am having problems understanding the binary output. The definition states:

"The variable-length encoding for u30 uses one to five bytes, depending on the magnitude of the value encoded. Each byte contributes its low seven bits to the value. If the high (eighth) bit of a byte is set, then the next byte of the abcFile is also part of the value."- learn.adobe.com/wiki/display/AVM2/4.1+Primitive+data+types

In order to understand the output, I took a look at a simple compiled ActionScript file containing a u30 and got:

00011101   00000100   01001101   01100001
Hx    1D         04         4D         61

This was the binary output from a simple u30 structure; the value should be a count of the number of integer values in the script. However, when I add more integer values to the script, the output becomes:

00100010   00000100   01001101   01100001
Hx    22         04         4D         61

Essentially only the first byte is affected, which would be understandable; however, the script file only contains 3 extra integer values, so I am confused as to how the first byte starts with a decimal value of 29 integers.

So could anyone tell me the best way to implement this data type, and whether it is the bit arrangement that makes it start at 28 values?
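For what it's worth, the writing side of the same scheme can be sketched like this (again, the class and method names are my own): emit the low 7 bits first, and set the high bit whenever more bytes follow.

```java
import java.io.ByteArrayOutputStream;

public class U30Writer {
    /**
     * Encodes a value as a variable-length u30: low 7 bits per byte,
     * least significant group first, high bit = "more bytes follow".
     */
    public static byte[] writeU30(int value) {
        value &= 0x3FFFFFFF;              // u30: only 30 bits are meaningful
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        do {
            int b = value & 0x7F;         // take the low 7 bits
            value >>>= 7;
            if (value != 0) {
                b |= 0x80;                // set continuation bit
            }
            out.write(b);
        } while (value != 0);
        return out.toByteArray();
    }
}
```

For example, encoding 29 produces the single byte `0x1D`, while encoding 300 produces the two bytes `0xAC 0x02`.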

after you added

Hey, there was only one integer value in the first example, and in the second I added an extra 3, meaning there was 1 in the first and 4 integers total in the second.

Which is why I am confused as to why it starts with a minimal decimal value of 29 for a compiled SWF with 1 integer value. I'd have assumed it must start at 0, or at least 1; that's why I am assuming it must be the way the bits are organised that gives the real value?

when i add more integer values

How many values did you add? The value of the first byte in what you posted went from x1D (29) to x22 (34).
Note: 00100010 is x22

Since none of the high-order (leftmost) bits are set, your posts each show 4 values.
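As a quick sanity check of that byte, using only standard library calls:

```java
public class BitCheck {
    public static void main(String[] args) {
        int first = Integer.parseInt("00100010", 2);    // the new first byte
        System.out.println(Integer.toHexString(first)); // prints "22"
        System.out.println(first);                      // prints 34
        // High (eighth) bit test: false means this byte completes the u30.
        System.out.println((first & 0x80) != 0);        // prints "false"
    }
}
```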

Can you explain what your problem is and post the code and its output you are having trouble with?

Edited 4 Years Ago by NormR1


The problem is the first set:

00011101   00000100   01001101   01100001
Hx    1D         04         4D         61

Only contained one value, yet its decimal value starts at 29, while the second set of hex contains 3 new values, or 4 in total, and is therefore equivalently 34 in decimal.
My question is really: where do the original 28 values come from, so that adding one makes 29?
Does this pertain to the way the bits make up the final value? Or is it genuinely saying I have 28 values?
Example:
00000001 would be saying I had a single value, if it were just a standard byte.
00011101 somehow says I have 29 values, when it should only be one. Or am I not reading the structure of the binary correctly; am I to omit the first 7 bits?

Basically I need help understanding how the binary is laid out, as the specification is vague when it comes to reading the values. I want to write a Java program to read the binary data, but I cannot unless I understand how 29 in binary somehow really means 1.
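For what it's worth, if you decode the four bytes from the first post as consecutive u30 values (which works here since no high bits are set, so each byte stands alone), the numbers come out as 29, 4, 77, 97. A quick self-contained sketch (class and method names are mine):

```java
public class U30Demo {
    /** Decodes consecutive u30 values from a byte array, in order. */
    public static int[] decodeAll(byte[] data) {
        int[] values = new int[data.length];   // upper bound: one value per byte
        int count = 0, i = 0;
        while (i < data.length) {
            int result = 0, shift = 0, b;
            do {
                b = data[i++] & 0xFF;
                result |= (b & 0x7F) << shift;  // low 7 bits, LSB group first
                shift += 7;
            } while ((b & 0x80) != 0 && i < data.length);
            values[count++] = result & 0x3FFFFFFF;
        }
        return java.util.Arrays.copyOf(values, count);
    }

    public static void main(String[] args) {
        // The bytes from the first post: 1D 04 4D 61. All high bits are clear,
        // so each byte is its own one-byte u30.
        byte[] posted = {0x1D, 0x04, 0x4D, 0x61};
        System.out.println(java.util.Arrays.toString(decodeAll(posted)));
        // prints [29, 4, 77, 97]
    }
}
```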

Any help would be awesome :)


Just realized I wasn't clear: when I'm referring to values, I mean the number that the first byte should represent.
So 29 should be a hex value equivalent to the number of integer values in a script, when I have in fact only one integer value in the script; it's almost like it skips the first 28. I am not referring to the number of bytes or hex values I posted.

Edited 4 Years Ago by trishtren

Only contained one value, yet its decimal value starts at 29,

Sounds like you do not have the correct definition of what the values represent.
You say there is one item and the value of the data is 29.

Edited 4 Years Ago by NormR1

is one item and the value of the data is 29.

Yes, the ActionScript file I compiled has only one new integer declaration.
And the hex that I posted is the bytes containing the number of integers in the file. But for some reason it starts at 28, and when I added a new integer to the script it goes up to 29, and so on with the other examples. So the count seems to be correct; it just seems to start at 28 instead of, well... 0 or 1. So what I was confused about is whether this was because of the organisation of the bits, i.e. the leftmost being ignored unless it is one and the right 7 being the value. But since no one seems to understand how those values are set, it would seem there's a lack of definition in the Adobe documentation... again. Thanks anyway, guys!
