Our team is using a SecureRandom to generate a list of key pairs (the SecureRandom is passed to a KeyPairGenerator). We cannot agree on which of the following two options to use:

  • Create a new instance every time we need to generate a key pair

  • Initialize a static instance and use it for all key pairs

  • Which approach is generally better, and why?

    ADDED: My gut feeling is that the second option is more secure. But my only argument is a theoretical attack based on the assumption that the pseudorandomness is derived from the current timestamp: someone may see the creation time of the key pair, guess timestamps in the surrounding time interval, compute the possible pseudorandom sequences, and obtain the key material.

    ADDED: My assumption about determinism based on a timestamp was wrong. That's the difference between Random and SecureRandom. So, it looks like the answer is: in terms of security it doesn't really matter.
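
    For concreteness, here is a minimal sketch of the two options being debated (the algorithm name and key size are illustrative only, not part of the question):

    import java.security.KeyPair;
    import java.security.KeyPairGenerator;
    import java.security.NoSuchAlgorithmException;
    import java.security.SecureRandom;

    public class KeyPairFactory {
        // Option 2: one statically initialized SecureRandom shared by all key pairs.
        private static final SecureRandom SHARED_RNG = new SecureRandom();

        // Option 1: create a brand-new SecureRandom for every key pair.
        public static KeyPair newRngPerKey() throws NoSuchAlgorithmException {
            KeyPairGenerator kpg = KeyPairGenerator.getInstance("RSA");
            kpg.initialize(2048, new SecureRandom());
            return kpg.generateKeyPair();
        }

        // Option 2: reuse the shared instance for every key pair.
        public static KeyPair sharedRng() throws NoSuchAlgorithmException {
            KeyPairGenerator kpg = KeyPairGenerator.getInstance("RSA");
            kpg.initialize(2048, SHARED_RNG);
            return kpg.generateKeyPair();
        }
    }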

    Unlike the java.util.Random class, the java.security.SecureRandom class must produce non-deterministic output on each call.

    What that means is that, in the case of java.util.Random, if you were to recreate an instance with the same seed each time you needed a new random number, you would essentially get the same result every time. SecureRandom, however, is guaranteed NOT to do that, so creating a single instance or creating a new one each time does not affect the randomness of the bytes it generates.
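
    To illustrate the difference described above, a minimal, self-contained sketch (the seed value is arbitrary):

    import java.security.SecureRandom;
    import java.util.Random;

    public class SeedDemo {
        public static void main(String[] args) {
            // Two java.util.Random instances built with the same seed yield
            // exactly the same sequence of values.
            Random a = new Random(42L);
            Random b = new Random(42L);
            System.out.println(a.nextLong() == b.nextLong()); // true
            System.out.println(a.nextLong() == b.nextLong()); // true

            // Two self-seeded SecureRandom instances each seed themselves from
            // the platform's entropy source, so their outputs do not repeat.
            SecureRandom c = new SecureRandom();
            SecureRandom d = new SecureRandom();
            System.out.println(c.nextLong() == d.nextLong()); // (almost certainly) false
        }
    }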

    So, purely from a good-coding-practice point of view, why create many instances when one will do?

    Please, please don't use this as justification to use the same PRNG. The docs are ambiguous, and if it's wrong, you just made cracking all your keys as easy as figuring out the seed to the PRNG. – Nick Johnson Nov 19, 2008 at 10:31

    The secure randoms will be seeded by the system's random number generator, also according to the docs. In my opinion the above comment is not correct anymore (if it ever was). It would be strange if this were an issue; imagine two different parts of the same application each creating a SecureRandom; you would not want that random to be insecure, right? – Maarten Bodewes Feb 25, 2018 at 11:36

    If you create a SecureRandom with the same seed then you would get the same result every time, at least when you use the Oracle default. Only if you let the SecureRandom seed itself, or if you reseed it using the generateSeed method, are you guaranteed non-deterministic behavior. – Maarten Bodewes Feb 25, 2018 at 12:06

    For SecureRandom you would want to consider occasionally reseeding (using system entropy in most cases) via a call like so:

    mySecureRandom.setSeed(mySecureRandom.generateSeed(someInt));
    

    so as to give a potential attacker something less than unlimited time to discover your key.

    There are some great write-ups about this consideration on the Justice League blog.

    I've voted this up as it contains correct information, but please note that this answer does not answer the question posed on how many instances should be created. – Maarten Bodewes Feb 25, 2018 at 12:16

    I have a reason to believe that it would actually be more random, but I'm waiting for argument-backed replies to verify or refute my theory. Thanks for the quick reply :) – ngn Nov 17, 2008 at 14:14

    If you want true randomness, then you need a non-deterministic source. There's a link on SO, somewhere... – Mitch Wheat Nov 17, 2008 at 14:18

    This is likely correct, but I have downvoted it anyway as it isn't backed up by any argumentation, rendering it useless. – Maarten Bodewes Feb 25, 2018 at 12:07

    Every SecureRandom instance is seeded from some entropy pool. Depending on the OS, this might be an entropy pool maintained by the OS, like /dev/random on Linux, or it might be something the JVM cooks up itself. In some earlier implementations, the Sun JVM used to spawn a number of threads and use their timing data to create the seed.

    Creating a new SecureRandom on every call might slow the application down, since creating the seed can block. It's better to reuse a statically created instance, but make sure to reseed it after a fixed number of random bytes have been extracted from it.

    You may want to create a wrapper around a SecureRandom instance that counts the number of bytes extracted via nextBytes or generateSeed calls and, after a certain number of bytes, reseeds the internal SecureRandom instance from the system entropy pool.
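
    A minimal sketch of such a wrapper (the class name and the reseed threshold are made up for illustration):

    import java.security.SecureRandom;

    public class ReseedingSecureRandom {
        private static final long RESEED_AFTER_BYTES = 1 << 20; // arbitrary threshold

        private final SecureRandom delegate = new SecureRandom();
        private long bytesServed = 0;

        public synchronized void nextBytes(byte[] out) {
            delegate.nextBytes(out);
            bytesServed += out.length;
            if (bytesServed >= RESEED_AFTER_BYTES) {
                // generateSeed() pulls fresh entropy from the seed source; setSeed()
                // supplements (does not replace) the state of an already-seeded instance.
                delegate.setSeed(delegate.generateSeed(32));
                bytesServed = 0;
            }
        }
    }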

    The wrapper approach, however, is not possible with Java on Linux, since the SecureRandom instance you get from new SecureRandom() is nothing but a wrapper around /dev/random, and every call to nextBytes or generateSeed actually drains the OS entropy pool. On Linux and Solaris, it's better to use a JCE provider for SecureRandom creation.
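
    If you do go down that road, asking for a specific algorithm instead of taking the platform default might look like this (whether "SHA1PRNG" is available depends on the installed providers; treat this as a sketch, not a recommendation):

    import java.security.NoSuchAlgorithmException;
    import java.security.SecureRandom;

    public class ExplicitPrng {
        public static SecureRandom create() throws NoSuchAlgorithmException {
            // Request a specific PRNG rather than the platform default, which on
            // Linux may be backed directly by /dev/random or /dev/urandom.
            SecureRandom sr = SecureRandom.getInstance("SHA1PRNG");
            // Trigger self-seeding up front so the (possibly blocking) seeding cost
            // is paid once at creation time instead of on first use.
            sr.nextBytes(new byte[1]);
            return sr;
        }
    }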

    Many folks are wondering why to reseed every so often. It isn't because the numbers get less random. It is because the longer you use the PRNG without reseeding, the more time you give a potential attacker to try to brute-force your seed. While I agree that you should reseed periodically, it can be on a modest interval (hours to a day?), not every time. – Matthew McCullough Jul 19, 2010 at 3:56

    Reseeding every time is not implied by the answer, only after a fixed number of times. But be warned: this answer contains a lot of implementation-specific information, and a lot of things have changed in the meantime. For instance, the Linux implementation uses the non-blocking /dev/urandom by now, but may be reconfigured to use /dev/random. – Maarten Bodewes Feb 25, 2018 at 12:13

    I would not rely on SecureRandom to be anything other than a cryptographically secure PRNG. The complete quote that Gowri is using from the javadocs is:

    Additionally, SecureRandom must produce non-deterministic output and therefore it is required that the seed material be unpredictable and that output of SecureRandom be cryptographically strong sequences as described in RFC 1750: Randomness Recommendations for Security.

    It's less than clear from this what the real expectation is - RFC 1750 details the use of hardware to enhance random number generation, but the javadocs say "therefore it is required that the seed material be unpredictable", which would seem to contradict this.

    The safest assumption to work on is that your implementation of SecureRandom is simply a cryptographically-secure PRNG, and therefore that your keys are no more secure than the random seed that you use. Thus, initializing a new SecureRandom with a new (unique, truly random) seed for each key would be the safest bet.
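
    A minimal sketch of that suggestion, assuming the fresh seed material comes from generateSeed() (where you actually obtain truly random seed bytes is the hard part and is not settled here):

    import java.security.KeyPair;
    import java.security.KeyPairGenerator;
    import java.security.NoSuchAlgorithmException;
    import java.security.SecureRandom;

    public class PerKeySeeding {
        public static KeyPair generate() throws NoSuchAlgorithmException {
            // A fresh SecureRandom for this key, explicitly seeded with new entropy.
            SecureRandom seeder = new SecureRandom();
            SecureRandom rng = new SecureRandom(seeder.generateSeed(32));

            KeyPairGenerator kpg = KeyPairGenerator.getInstance("RSA");
            kpg.initialize(2048, rng);
            return kpg.generateKeyPair();
        }
    }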

    It's turtles all the way down! Seriously, any good Java platform uses the system-provided RNG to seed its SecureRandom instances, and it is extremely unlikely that you can do much better than that. Requesting people to generate their own seed is like asking people to build their own car; sure, some may succeed, but the majority will muck it up. – Maarten Bodewes Feb 25, 2018 at 12:02

    "It's less than clear from this what the real expectation is..." Why? RFC 1750 says this: "In cases where a series of random quantities must be generated, an adversary may learn some values in the sequence. In general, they should not be able to predict other values from the ones that they know." Sounds like it's saying the sequence must not be deterministic. What I find confusing is the apparent contradiction a little further in the Javadoc: "Many SecureRandom implementations ... use a deterministic algorithm to produce a pseudo-random sequence from a true random seed." So which is it? – shmosel Jun 20, 2019 at 0:48

    Alex, as I tacked on to some other responses below, wouldn't you want to reseed occasionally so that attackers don't have "unlimited" time to try to discover your seed? – Matthew McCullough Jul 19, 2010 at 3:59

    I decided to ask the Java compiler. The short answer is that, yes, re-using the SecureRandom object has some performance benefits, but it is no better or worse with respect to actual randomness. This is purely a tuning issue, not a security issue.

    Note, however, that it takes a little while for the JIT to kick in before you see the benefits. The take-away is that for heavy/frequent use, definitely re-use the object. For infrequent use, you might be better off using a new object every time.

    Results

    warm up 
    -----------------------------
    default seed - re-use - 1807 ms
    explicit seed - re-use - 835 ms
    constant seed - new every time - 1044 ms
    default seed - new every time - 1621 ms
    -----------------------------
    iteration 0
    -----------------------------
    default seed - re-use - 412 ms
    explicit seed - re-use - 418 ms
    constant seed - new every time - 955 ms
    default seed - new every time - 1676 ms
    -----------------------------
    iteration 1
    -----------------------------
    default seed - re-use - 389 ms
    explicit seed - re-use - 369 ms
    constant seed - new every time - 893 ms
    default seed - new every time - 1498 ms
    

    Source

    package foo;
    import static org.junit.Assert.assertEquals;
    import static org.junit.Assert.assertTrue;
    import java.security.SecureRandom;
    import java.util.HashSet;
    import java.util.Set;
    import org.junit.BeforeClass;
    import org.junit.Test;
    public class SecureRandomTest {
        static long elapsedMillis( long startNs ) {
            long now = System.nanoTime();
            return (now - startNs) / 1_000_000;
        }
        final static long seed = 123456789123456L;
        final static int nIter = 1000000;
        public static void main(String[] args) {
            warmup();
            SecureRandomTest test = new SecureRandomTest();
            for ( int ix = 0; ix < 5; ++ix ) {
                test.run(ix);
            }
        }
        void run(int ix) {
            System.out.printf( "iteration %d\n-----------------------------\n", ix);
            secure_random_default_seed_reuse();
            secure_random_constant_seed_reuse();
            secure_random_constant_seed();
            secure_random_default_seed();
            System.out.println("-----------------------------");
        }
        /* Warm up JVM/JIT */
        @BeforeClass
        public static void warmup() {
            new SecureRandomTest().run(-1);
        }
        @Test
        public void secure_random_constant_seed() {
            long started = System.nanoTime();
            int nDupes = 0, ix = 0;
            Set<Long> generated = new HashSet<>(nIter);
            for ( /**/; ix < nIter; ++ix) {
                SecureRandom rand = new SecureRandom();
                rand.setSeed(seed);
                long xRand = rand.nextLong();
                if ( !generated.add(xRand) ) {
                    ++nDupes;
                }
            }
            assertEquals( "Unexpected # of dupes " + nDupes + ", ix == " + ix, nIter-1, nDupes );
            System.out.printf( "constant seed - new every time - %d ms\n", elapsedMillis(started) );
        }
        @Test
        public void secure_random_constant_seed_reuse() {
            long started = System.nanoTime();
            int nDupes = 0, ix = 0;
            SecureRandom rand = new SecureRandom();
            rand.setSeed(seed);
            Set<Long> generated = new HashSet<>(nIter);
            for ( /**/; ix < nIter; ++ix) {
                long xRand = rand.nextLong();
                if ( !generated.add(xRand) ) {
                    ++nDupes;
                }
            }
            assertTrue( "Unexpected # of dupes " + nDupes + ", ix == " + ix, 0 == nDupes );
            System.out.printf( "explicit seed - re-use - %d ms\n", elapsedMillis(started) );
        }
        @Test
        public void secure_random_default_seed() {
            long started = System.nanoTime();
            int nDupes = 0, ix = 0;
            Set<Long> generated = new HashSet<>(nIter);
            for ( /**/; ix < nIter; ++ix) {
                long xRand = new SecureRandom().nextLong();
                if ( !generated.add(xRand) ) {
                    ++nDupes;
                }
            }
            assertTrue( "Unexpected # of dupes " + nDupes + ", ix == " + ix, 0 == nDupes );
            System.out.printf( "default seed - new every time - %d ms\n", elapsedMillis(started) );
        }
        @Test
        public void secure_random_default_seed_reuse() {
            long started = System.nanoTime();
            int nDupes = 0, ix = 0;
            SecureRandom rand = new SecureRandom();
            Set<Long> generated = new HashSet<>(nIter);
            for ( /**/; ix < nIter; ++ix) {
                long xRand = rand.nextLong();
                if ( !generated.add(xRand) ) {
                    ++nDupes;
                }
            }
            assertTrue( "Unexpected # of dupes " + nDupes + ", ix == " + ix, 0 == nDupes );
            System.out.printf( "default seed - re-use - %d ms\n", elapsedMillis(started) );
        }
    }


    Given modern GC implementations, I know it wouldn't affect performance at all. But the question is: would it be less random then? – ngn Nov 17, 2008 at 14:12

    The randomization doesn't decline over extended use. It is just that you are giving an attacker "time" to find your first (only?) seed. The opposite option, reseeding every time, is generally too expensive. – Matthew McCullough Jul 19, 2010 at 3:58
