As I have been working a lot with JSON (http://json.org/ or RFC 4627, http://rfc.roxen.com/4627 for those who have missed it), I recently hacked up some tools and very sloppy Pike support for it. (The first 95% of the decoder bits were kindly contributed by Martin Nilsson.)
Neither the encoder nor the decoder is quite up to Pike standards, API-wise or code-quality-wise, so we are reluctant to commit them into the Pike core as is, but if someone wants to shape them up a bit and get them all the way there, feel free. The code (plus three quick examples: one that indents a JSON file for readability, a JSON-to-Pike-literal translator, and a minifier, the last of which is particularly not recommended for anything but data in the printable ASCII range) is here:
http://svn.devjavu.com/ecmanaut/trunk/tools/json/
http://ecmanaut.devjavu.com/projects/ecmanaut/changeset/5
The latter URL is a Trac view of the initial commit, for the curious. Both encoder and decoder only really handle Pike native types, and neither verifies the correctness of its input or output.
The encoder will happily output illegal JSON if you feed it structures that are illegal according to the JSON specification (your mapping indices must be strings to produce valid JSON). It presently wastes some resources sorting mapping keys, with no way of turning that off. It cannot produce the three literal values null, true and false. It also does not quote strings properly, so control characters and Unicode escapes may not come out as legal JSON.
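To illustrate the mapping-key point, something like this (a rough sketch, assuming encode_json from the code above is in scope; the exact output formatting may differ):

  // String indices: the encoder produces a valid JSON object
  encode_json( ([ "id": 17, "name": "foo" ]) );

  // Integer indices: the encoder will still emit something, but the
  // result is not legal JSON, since JSON object keys must be strings
  encode_json( ([ 17: "foo" ]) );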
And the APIs are lightly wrapped semi-smelly beasts. But I think those are my only gripes. :-)
Usage is at least straightforward (more detailed docs in the code):
string encode_json( mixed data );
mixed decode_json( string|Stdio.File input );
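For instance, a minimal round trip might look like this (just a sketch; it assumes encode_json and decode_json from the code above are in scope, e.g. by inheriting the file):

  int main()
  {
    // A string-keyed mapping, which encodes to a valid JSON object
    mapping data = ([ "name": "ecmanaut", "tags": ({ "json", "pike" }) ]);
    string json = encode_json(data);   // serialize to a JSON string
    mixed back  = decode_json(json);   // parse it back into Pike values
    write("%O\n", back);               // dump the decoded structure
    return 0;
  }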