An interesting phenomenon of Lua's tables


I'm new to Lua, and I'm learning the usage of tables these days. From the tutorials I know that Lua treats numeric-indexed items and non-numeric-indexed items differently, so I did some tests myself, and today I found an interesting phenomenon that I can't explain:

The following code

t = {1, 2, 3, a='a', b='b'}
print(#t)

prints

3 

because the # operator counts numeric-indexed items only. Then I tested the following code

t = {1, 2, 3, a='a', b='b'}
print(#t)
for i = 100,200 do
    t[i] = i
end
print(#t)

I get

3
3

Up to this point I thought Lua treats discontinuous items added later as non-numeric-indexed ones. However, after I changed the code a little bit

t = {1, 2, 3, a='a', b='b'}
print(#t)
for i = 100,300 do
    t[i] = i
end
print(#t)

I get

3
300

I'm confused by this phenomenon. Does anyone know the reason? Thanks.

(This phenomenon can be reproduced at http://www.lua.org/cgi-bin/demo)

Update:

I tried this code

t = {1, 2, 3, a='a', b='b'}
print(#t)
for i = 100,300 do
    t[i] = i
    print("add", i, #t)
end
for i = 100,300 do
    t[i] = nil
    print("del", i, #t)
end

and I get

3
add 100 3
add 101 3
add 102 3
...
add 223 3
add 224 3
add 225 3
add 226 226
add 227 227
add 228 228
...
add 298 298
add 299 299
add 300 300
del 100 300
del 101 300
del 102 300
...
del 253 300
del 254 300
del 255 300
del 256 3
del 257 3
del 258 3
...
del 298 3
del 299 3
del 300 3

This example shows that Lua converts a table between sparse and dense representations.

I haven't looked at how the # operator is implemented, but I'd bet what's going on is that by adding the indexes 100..300 you've caused the range 1-300 to become dense enough that the indexes 100-300 end up in the "array" part of the table implementation instead of the "hash" part.

Update:

OK, I looked at the source for the primitive table length. If the final entry in the array part is nil, it binary-searches the array to find the lowest "boundary" (a non-nil index followed by a nil index). If it's not nil, it decides the boundary must be in the hash part and searches for it there.
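To make that concrete, here is a rough Lua model of that two-phase search (the real code is C, in luaH_getn in ltable.c; this is only a sketch). Plain Lua cannot see the internal array-part size, so the asize parameter below is a hypothetical stand-in for it:

-- A sketch of the boundary search described above; 'asize' stands in
-- for the hidden array-part size, which real Lua tracks internally.
local function table_length(t, asize)
  if asize > 0 and t[asize] == nil then
    -- A boundary lies inside the array part: binary-search for it.
    -- Invariant: t[hi] == nil, and t[lo] ~= nil (or lo == 0).
    local lo, hi = 0, asize
    while hi - lo > 1 do
      local mid = math.floor((lo + hi) / 2)
      if t[mid] == nil then hi = mid else lo = mid end
    end
    return lo
  end
  -- The last array slot is non-nil, so the boundary must lie beyond
  -- the array part: double an upper bound until a nil is found, then
  -- binary-search between the last non-nil probe and that nil.
  local lo, hi = asize, asize + 1
  while t[hi] ~= nil do
    lo = hi
    hi = hi * 2
  end
  while hi - lo > 1 do
    local mid = math.floor((lo + hi) / 2)
    if t[mid] == nil then hi = mid else lo = mid end
  end
  return lo
end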

So for a table containing the numeric indexes {1, 2, 3, 100..200}, I assume it's not dense enough, and the array part contains only {1, 2, 3}. For the table containing {1, 2, 3, 100..300}, it's presumably dense enough that the array part ends somewhere within the 100..300 part (I think the array part is always a power of 2, so it can't possibly end at 300, but I'm not 100% positive).

Update 2:

When a Lua table is rehashed, it counts the number of integer keys. It then walks up the powers of 2, up to no more than twice the number of integral keys, and finds the largest power of 2 that is at least 50% dense (meaning that if the array part were that large, at least 50% of its values would be non-nil).

So for {1, 2, 3, 100..200}, it walks up

1: 100% dense; good
2: 100% dense; good
4: 75% dense; good
8: 37.5% dense; bad
16: 18.75% dense; bad
32: 9.375% dense; bad
64: 4.6875% dense; bad
128: 25% dense; bad
256: 40.625% dense; bad

The best value is 4, so it ends up with an array size of 4. Since slot 4 is nil, it binary-searches the array part for the boundary and finds 3.
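Here is a rough Lua model of that size computation (the real logic lives in computesizes() in ltable.c; optimal_array_size is a made-up name, and this is a simplified illustration, not the actual implementation):

-- Walk the powers of 2 and keep the largest one for which more than
-- half of the slots 1..size would be non-nil.
local function optimal_array_size(present, nkeys)
  local best = 0
  local size = 1
  while size / 2 < nkeys do              -- walk up the powers of 2
    local count = 0
    for i = 1, size do
      if present[i] then count = count + 1 end
    end
    if 2 * count > size then             -- more than half the slots occupied?
      best = size
    end
    size = size * 2
  end
  return best
end

-- Replay the {1, 2, 3, 100..200} walk from above:
local present, nkeys = {}, 0
for _, k in ipairs({1, 2, 3}) do present[k] = true; nkeys = nkeys + 1 end
for i = 100, 200 do present[i] = true; nkeys = nkeys + 1 end
print(optimal_array_size(present, nkeys))  --> 4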

Once you add 201..300, the last step becomes

256: 62.5% dense; good

which causes the array part to cover 1..256, and since index 256 is non-nil, it again searches for the boundary in the hash part and gets 300.
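Continuing the same chunk with the hypothetical helper above, the 100..300 case indeed lands on 256:

-- Extend the same key set with 201..300 and rerun the model:
for i = 201, 300 do present[i] = true; nkeys = nkeys + 1 end
print(optimal_array_size(present, nkeys))  --> 256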


In the end, Lua 5.2 defines a "sequence" as a table with exclusively integral keys starting at 1 and going up with no holes, and it defines # as being valid only for sequences. That way Lua can get away with the weird behavior you noticed for tables that have holes in their integral sequences.
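For example (the non-sequence result below depends on the internal layout, so either value is legal):

local seq = {10, 20, 30}   -- a sequence: keys 1..3, no holes
print(#seq)                --> 3

local t = {10, 20, 30}
t[5] = 50                  -- key 4 is missing, so t is not a sequence
print(#t)                  --> 3 or 5: either is a valid "boundary"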

