Daiki Ueno <ueno@gnu.org> writes:
 Makefile.in                |    4 +-
 nettle-internal.h          |    2 +-
 nettle-meta-hashes.c       |    2 +
 nettle-meta.h              |    2 +
 nettle.texinfo             |   68 +
 sha3.c                     |   13 +
 sha3.h                     |   56 +
 shake128-meta.c            |   42 +
 shake128.c                 |   84 +
 shake256-meta.c            |   42 +
 shake256.c                 |   84 +
 testsuite/.test-rules.make |    6 +
 testsuite/Makefile.in      |    1 +
 testsuite/meta-hash-test.c |    2 +
 testsuite/shake.awk        |   14 +
 testsuite/shake128-test.c  | 6183 ++++++++++++++++++++++++++++++++++++++++++++
 testsuite/shake256-test.c  | 6183 ++++++++++++++++++++++++++++++++++++++++++++
 testsuite/testutils.c      |   43 +-
 18 files changed, 12812 insertions(+), 19 deletions(-)
 create mode 100644 shake128-meta.c
 create mode 100644 shake128.c
 create mode 100644 shake256-meta.c
 create mode 100644 shake256.c
 create mode 100755 testsuite/shake.awk
 create mode 100644 testsuite/shake128-test.c
 create mode 100644 testsuite/shake256-test.c
This is a contribution that is mostly independent of the rest of the curve448 patches.
diff --git a/nettle-meta-hashes.c b/nettle-meta-hashes.c
index 2220968c..1cd7f677 100644
--- a/nettle-meta-hashes.c
+++ b/nettle-meta-hashes.c
@@ -50,5 +50,7 @@ const struct nettle_hash * const nettle_hashes[] = {
   &nettle_sha3_256,
   &nettle_sha3_384,
   &nettle_sha3_512,
+  &nettle_shake128,
+  &nettle_shake256,
   NULL
 };
Does it make sense to group shake (which is an "xof") with the hash functions? I think we could skip the -meta things for now, and design an interface for xof-like functions later.
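Just to make that concrete, the kind of thing I have in mind for later could look roughly like this. This is only a sketch; none of these names exist in nettle-meta.h today, only the nettle_hash_init_func and nettle_hash_update_func typedefs do.

/* Sketch of a possible future xof meta interface; all names here
   are made up, nothing like this exists in nettle-meta.h. */
#include <nettle/nettle-meta.h>

struct nettle_xof
{
  const char *name;
  unsigned context_size;
  unsigned block_size;
  nettle_hash_init_func *init;
  nettle_hash_update_func *update;
  /* Unlike nettle_hash_digest_func, there is no fixed digest size;
     the caller chooses how many octets of output to extract.  */
  void (*output) (void *ctx, size_t length, uint8_t *dst);
};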
+@subsubsection @acronym{SHAKE128}
+In addition to those SHA-3 hash functions, Nettle also provides two
+SHA-3 extendable-output functions (XOFs).  Unlike SHA-3 hash functions,
+the output of SHA-3 XOFs can be extended to any desired length.
Maybe some reference to the section on key derivation functions would be helpful, since they serve a similar purpose.
+Nettle defines SHAKE128 in @file{<nettle/sha3.h>}.
+@deftp {Context struct} {struct shake128_ctx}
+@end deftp
+@defvr Constant SHAKE128_DIGEST_SIZE
+The size of a SHAKE128 digest, i.e. 64.
+@end defvr
Does it make sense at all to define a digest size?
+@deftypefun void shake128_digest (struct shake128_ctx *@var{ctx}, size_t @var{length}, uint8_t *@var{digest})
+Performs final processing and extracts the message digest, writing it
+to @var{digest}.  @var{length} may be smaller than
+@code{SHAKE128_DIGEST_SIZE}, in which case only the first @var{length}
+octets of the digest are written.
Isn't the point to allow arbitrary "digest size", which is perhaps better called "output size"?
If it makes sense to use a very long output size, e.g., using shake as a stream cipher, should we provide some method to get the output incrementally, one or more blocks at a time?
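For example, a caller-side sketch of what I mean; shake128_output is a made-up name (the patch only provides shake128_init/update/digest), and the code assumes the patch's declarations in <nettle/sha3.h>:

/* Hypothetical sketch of incremental output.  shake128_output is not
   a real function; it stands for "return the next LENGTH octets of
   output on each call". */
#include <nettle/sha3.h>	/* struct shake128_ctx, per the patch */

void
generate_keystream (uint8_t *dst, size_t total,
		    size_t key_len, const uint8_t *key)
{
  struct shake128_ctx ctx;
  shake128_init (&ctx);
  shake128_update (&ctx, key_len, key);

  /* Pull the output one SHAKE128_BLOCK_SIZE chunk at a time,
     instead of allocating one huge buffer. */
  while (total > 0)
    {
      size_t n = total < SHAKE128_BLOCK_SIZE ? total : SHAKE128_BLOCK_SIZE;
      shake128_output (&ctx, n, dst);	/* hypothetical */
      dst += n;
      total -= n;
    }
}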
+void
+_sha3_shake_pad (struct sha3_state *state,
+		 unsigned block_size, uint8_t *block, unsigned pos)
+{
+  assert (pos < block_size);
+  block[pos++] = 0x1F;
+  memset (block + pos, 0, block_size - pos);
+  block[block_size - 1] |= 0x80;
+  sha3_absorb (state, block_size, block);
+}
The only difference from _sha3_pad is the magic constant, 0x1f instead of 6? Maybe add the padding value as an argument to _sha3_pad, rather than introducing a new function? The function is not part of the supported API, and from a search at https://codesearch.debian.net/search?q=_sha3_pad, it seems it's also not used outside of nettle.
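Concretely, something like this inside sha3.c; just a sketch of the changed signature, assuming sha3_absorb is the existing static helper there:

/* Sketch only: one padding routine shared by the hash and shake
   cases, with the domain-separation byte passed in by the caller.
   Otherwise this follows the patch's _sha3_shake_pad. */
#include <assert.h>
#include <string.h>
#include "sha3.h"

void
_sha3_pad (struct sha3_state *state, uint8_t magic,
	   unsigned block_size, uint8_t *block, unsigned pos)
{
  assert (pos < block_size);
  block[pos++] = magic;			/* 0x06 for SHA-3, 0x1F for SHAKE */
  memset (block + pos, 0, block_size - pos);
  block[block_size - 1] |= 0x80;	/* final 1 bit of the pad10*1 rule */
  sha3_absorb (state, block_size, block);
}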
+#define SHAKE128_DIGEST_SIZE 0 /* not used */
I think it's better to not define _DIGEST_SIZE at all.
+#define SHAKE128_BLOCK_SIZE 168
How is the block size motivated? It gives roughly the same security as a hypothetical sha3-128 would give, i.e., "64-bit security" against collisions and "128-bit security" against second preimages? It seems to affect both processing of input and output.
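For reference, my understanding is that the block size is just the Keccak rate, i.e. the 1600-bit permutation width minus twice the capacity, expressed in octets:

/* The block size is the Keccak "rate": permutation width minus
   twice the security parameter (the "capacity"), in octets. */
#define SHAKE128_BLOCK_SIZE ((1600 - 2*128) / 8)	/* == 168 */
#define SHAKE256_BLOCK_SIZE ((1600 - 2*256) / 8)	/* == 136 */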
+struct shake256_ctx
+{
+  struct sha3_state state;
+  unsigned index;
+  uint8_t block[SHAKE256_BLOCK_SIZE];
+};
This is identical to sha3_256_ctx, right? Would it make sense to use the same context, but with a separate function to be used instead of sha3_256_digest? Both _init and _update are also identical, if I'm not missing something. Could be called sha3_256_shake or sha3_256_xof or something like that? Maybe one function to produce all output and reset the context, and a separate function or functions to produce output incrementally?
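To make that concrete, roughly the following; the names are placeholders, not an existing API, and the point is that struct sha3_256_ctx, sha3_256_init and sha3_256_update are reused as-is:

/* Sketch of the suggested interface; names are placeholders. */
#include <nettle/sha3.h>

/* One-shot: produce LENGTH octets of SHAKE256 output and reset the
   context, analogous to sha3_256_digest but with no upper bound on
   LENGTH. */
void
sha3_256_shake (struct sha3_256_ctx *ctx,
		size_t length, uint8_t *digest);

/* Incremental: after the message has been absorbed, each call returns
   the next LENGTH octets of output, so very long outputs (e.g. a
   keystream) need no huge buffer. */
void
sha3_256_shake_output (struct sha3_256_ctx *ctx,
		       size_t length, uint8_t *digest);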
Regards,
/Niels