If performance does not matter, you could add a Pike function to the JSON module called encode_anyway(), which recursively filters the data and passes it on to encode(). I personally prefer having a somewhat strict encoder/decoder, since that also catches programmer mistakes which could otherwise go unnoticed.
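Just to sketch the general shape of what I mean (encode_anyway() and filter_json() are made-up names here, not anything that exists in the module, and the key coercion is only good enough for simple key types like ints and floats):

    string encode_anyway(mixed data)
    {
        // convert anything the strict encoder would refuse,
        // then hand the result to the normal encoder
        return Standards.JSON.encode(filter_json(data));
    }

    mixed filter_json(mixed data)
    {
        if (mappingp(data)) {
            mapping(string:mixed) res = ([]);
            foreach (data; mixed key; mixed val)
                res[(string)key] = filter_json(val);  // force keys to strings
            return res;
        }
        if (arrayp(data))
            return map(data, filter_json);
        return data;  // strings, ints and floats pass through unchanged
    }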
The duplicate key problem you mention is not easy to solve. Silently coercing colliding keys would also make the encoded output non-deterministic, since mappings have no order, so it is undefined which of the colliding entries ends up in the result.
arne
ps. i would commit a slightly modified version of your patch, unless you think we need to solve the mapping problem, too
On Thu, 23 May 2013, Martin Bähr wrote:
hi,
dealing with objects is pretty straightforward, but it gets more interesting with mapping keys.
json only allows strings as keys, so the current json module fails when it encounters keys that are not strings.
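for example, with the current strict encoder something along these lines fails instead of producing {"1":"a"}:

    Standards.JSON.encode(([ 1: "a" ]));  // non-string key, encoding fails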
the problem i need to solve involves converting user data into json without failing, and short of walking the input data beforehand to weed out bad values, catching them in the encode step is the only way.
python (http://stackoverflow.com/questions/1450957/pythons-json-module-converts-int-...) simply converts ints to strings, which is one option, but i think inserting the callback here too, to let the caller deal with the problem, is still better.
now i am wondering what arguments to pass to the callback: just the key? the key and the value? or the whole mapping?
one reason to pass the whole mapping is to help check for cases like ([ 1:"a", "1":"a" ]), which could result in duplicate keys if the callback simply does { return (string)key; }
passing the whole mapping to the callback would allow the caller to analyze the situation, but somehow i feel it would also make things complicated.
one option would be to accept the value from the callback, but check for duplicate keys in the module and throw on them. that should deal with the normal cases and only throw in unusual ones.
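a rough sketch of that idea, just to make it concrete (the function name and callback signature are made up for illustration, not a proposed API):

    string encode_with_key_callback(mapping m,
                                    function(mixed:string) key_callback)
    {
        mapping(string:mixed) out = ([]);
        foreach (m; mixed key; mixed val) {
            // let the callback decide how to stringify non-string keys
            string skey = stringp(key) ? (string)key : key_callback(key);
            // the module still refuses silent collisions
            if (has_index(out, skey))
                error("duplicate key %O after conversion\n", skey);
            out[skey] = val;
        }
        return Standards.JSON.encode(out);
    }

calling it with ([ 1:"a", "1":"a" ]) and lambda(mixed k) { return (string)k; } would then throw, because both keys end up as "1".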
greetings, martin.