Manpages

NAME

rand − Pseudo random number generation.

DESCRIPTION

This module provides a pseudo random number generator. The module contains a number of algorithms. The uniform distribution algorithms use the xoroshiro116+ and xorshift1024* algorithms by Sebastiano Vigna. The normal distribution algorithm uses the Ziggurat Method by Marsaglia and Tsang on top of the uniform distribution algorithm.

For some algorithms, jump functions are provided for generating non-overlapping sequences for parallel computations. The jump functions perform calculations equivalent to performing a large number of repeated calls for calculating new states.
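For example, a base state can be seeded once and then jumped repeatedly so that each parallel worker gets its own non-overlapping subsequence. This is a sketch; the seed values here are arbitrary:

```erlang
%% Seed one base state, then use jump/1 to derive
%% non-overlapping generator states for parallel workers.
Base = rand:seed_s(exrop, {1, 2, 3}),
S1 = rand:jump(Base),              % 2^64 steps ahead of Base
S2 = rand:jump(S1),                % another 2^64 steps ahead
%% Each state can now be used independently:
{X0, _} = rand:uniform_s(Base),
{X1, _} = rand:uniform_s(S1),
{X2, _} = rand:uniform_s(S2),
```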

The following algorithms are provided:

exrop:

Xoroshiro116+, 58 bits precision and period of 2^116-1

Jump function: equivalent to 2^64 calls

exs1024s:

Xorshift1024*, 64 bits precision and a period of 2^1024-1

Jump function: equivalent to 2^512 calls

exsp:

Xorshift116+, 58 bits precision and period of 2^116-1

Jump function: equivalent to 2^64 calls

This is a corrected version of the previous default algorithm, which has now been superseded by Xoroshiro116+ (exrop). Since there is no native 58-bit rotate instruction, this algorithm executes a little (say < 15%) faster than exrop. See the algorithms’ homepage.

The default algorithm is exrop (Xoroshiro116+). If a specific algorithm is required, make sure to always use seed/1 to initialize the state.

Undocumented (old) algorithms are deprecated but still implemented so old code relying on them will produce the same pseudo random sequences as before.

Note:

There were a number of problems in the implementation of the now undocumented algorithms, which is why they are deprecated. The new algorithms are a bit slower but do not have these problems:

Uniform integer ranges had a skew in the probability distribution that was not noticeable for small ranges, but for large ranges (smaller than the generator’s precision) the probability of producing a low number could be twice that of a high number.

Uniform integer ranges larger than or equal to the generator’s precision used a floating point fallback that calculated with only 52 bits, which is smaller than the requested range, so not all numbers in the requested range could even be produced.

Uniform floats had a non-uniform density: small values, that is values less than 0.5, were generated in ever smaller intervals as the generated value approached 0.0, although still uniformly distributed over sufficiently large subranges. The new algorithms produce uniformly distributed floats of the form N * 2.0^(-53), hence equally spaced.

Every time a random number is requested, a state is used to calculate it and a new state is produced. The state can either be implicit or be an explicit argument and return value.

The functions with implicit state use the process dictionary variable rand_seed to remember the current state.

If a process calls uniform/0 or uniform/1 without setting a seed first, seed/1 is called automatically with the default algorithm and creates a non-constant seed.

The functions with explicit state never use the process dictionary.

Examples:

Simple use; creates and seeds the default algorithm with a non-constant seed if not already done:

R0 = rand:uniform(),
R1 = rand:uniform(),

Use a specified algorithm:

_ = rand:seed(exs1024s),
R2 = rand:uniform(),

Use a specified algorithm with a constant seed:

_ = rand:seed(exs1024s, {123, 123534, 345345}),
R3 = rand:uniform(),

Use the functional API with a non-constant seed:

S0 = rand:seed_s(exrop),
{R4, S1} = rand:uniform_s(S0),

Create a standard normal deviate:

{SND0, S2} = rand:normal_s(S1),

Create a normal deviate with mean -3 and variance 0.5:

{ND0, S3} = rand:normal_s(-3, 0.5, S2),

Note:

The builtin random number generator algorithms are not cryptographically strong. If a cryptographically strong random number generator is needed, use something like crypto:rand_seed/0.
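The crypto application can plug a cryptographically strong generator into this module's API. A sketch, assuming the crypto application (and thus its OpenSSL backend) is available:

```erlang
%% Plug the crypto module's strong generator into the rand API.
%% After this, the implicit-state functions in rand are
%% cryptographically strong (but slower).
_ = crypto:rand_seed(),
R = rand:uniform(),      % strong float in [0.0, 1.0)
N = rand:uniform(100),   % strong integer in 1..100
```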

For all these generators the lowest bit(s) have slightly less random behaviour than all other bits: 1 bit for exrop (and exsp), and 3 bits for exs1024s. See, for example, the explanation in the Xoroshiro128+ generator source code:

Beside passing BigCrush, this generator passes the PractRand test suite
up to (and included) 16TB, with the exception of binary rank tests,
which fail due to the lowest bit being an LFSR; all other bits pass all
tests. We suggest to use a sign test to extract a random Boolean value.

If this is a problem, generate a boolean with something like this:

(rand:uniform(16) > 8)

And for a general range, with N = 1 for exrop, and N = 3 for exs1024s:

(((rand:uniform(Range bsl N) - 1) bsr N) + 1)
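Wrapped as a helper, the expression above generates N extra low bits and shifts them away, so the result 1 =< X =< Range is built only from the stronger high bits. The function name is ours, for illustration:

```erlang
%% Integer in 1..Range, discarding the N weakest low bits of
%% the raw generator output (N = 1 for exrop/exsp, 3 for exs1024s).
strong_uniform(Range, N) ->
    (((rand:uniform(Range bsl N) - 1) bsr N) + 1).
```

For example, strong_uniform(6, 1) returns an integer in 1..6 for a die roll under the default exrop algorithm.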

The floating point generating functions in this module waste the lowest bits when converting from an integer so they avoid this snag.

DATA TYPES

builtin_alg() =
exs64 | exsplus | exsp | exs1024 | exs1024s | exrop

alg() = builtin_alg() | atom()

alg_handler() =
#{type := alg(),
bits => integer() >= 0,
weak_low_bits => integer() >= 0,
max => integer() >= 0,
next :=
fun((alg_state()) -> {integer() >= 0, alg_state()}),
uniform => fun((state()) -> {float(), state()}),
uniform_n =>
fun((integer() >= 1, state()) -> {integer() >= 1, state()}),
jump => fun((state()) -> state())}

alg_state() =
exs64_state() |
exsplus_state() |
exs1024_state() |
exrop_state() |
term()

state() = {alg_handler(), alg_state()}

Algorithm-dependent state.

export_state() = {alg(), alg_state()}

Algorithm-dependent state that can be printed or saved to file.

exs64_state()

Algorithm specific internal state

exsplus_state()

Algorithm specific internal state

exs1024_state()

Algorithm specific internal state

exrop_state()

Algorithm specific internal state

EXPORTS

export_seed() -> undefined | export_state()

Returns the random number state in an external format. To be used with seed/1.

export_seed_s(State :: state()) -> export_state()

Returns the random number generator state in an external format. To be used with seed/1.
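An exported state can, for example, be printed, stored, and later handed back to seed/1 or seed_s/1 to reproduce the same sequence. A sketch with an arbitrary constant seed:

```erlang
S0 = rand:seed_s(exrop, {1, 2, 3}),
Exported = rand:export_seed_s(S0),   % {exrop, AlgState}, printable/storable
%% ... later, possibly in another process or node:
S1 = rand:seed_s(Exported),
{X, _} = rand:uniform_s(S0),
{X, _} = rand:uniform_s(S1),         % matches: same sequence is reproduced
```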

jump() -> NewState :: state()

Returns the state after performing a jump calculation on the state in the process dictionary.

This function generates a not_implemented error exception when the jump function is not implemented for the algorithm specified in the state in the process dictionary.

jump(State :: state()) -> NewState :: state()

Returns the state after performing a jump calculation on the given state.

This function generates a not_implemented error exception when the jump function is not implemented for the algorithm specified in the state.

normal() -> float()

Returns a standard normal deviate float (that is, the mean is 0 and the standard deviation is 1) and updates the state in the process dictionary.

normal(Mean :: number(), Variance :: number()) -> float()

Returns a normal N(Mean, Variance) deviate float and updates the state in the process dictionary.

normal_s(State :: state()) -> {float(), NewState :: state()}

Returns, for a specified state, a standard normal deviate float (that is, the mean is 0 and the standard deviation is 1) and a new state.

normal_s(Mean :: number(),
Variance :: number(),
State0 :: state()) ->
{float(), NewS :: state()}

Returns, for a specified state, a normal N(Mean, Variance) deviate float and a new state.
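Threading the state explicitly makes it straightforward to generate a list of deviates. A sketch (the function name and argument order are ours, for illustration):

```erlang
%% Generate Count normal N(Mean, Variance) deviates, threading
%% the generator state through the recursion.
normals(0, _Mean, _Variance, State, Acc) ->
    {lists:reverse(Acc), State};
normals(Count, Mean, Variance, State0, Acc) ->
    {X, State1} = rand:normal_s(Mean, Variance, State0),
    normals(Count - 1, Mean, Variance, State1, [X | Acc]).
```

For example, {Xs, S} = normals(10, -3, 0.5, rand:seed_s(exrop), []) gives ten deviates and the state to continue from.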

seed(AlgOrStateOrExpState ::
builtin_alg() | state() | export_state()) ->
state()

Seeds random number generation with the specified algorithm and time-dependent data if AlgOrStateOrExpState is an algorithm.

Otherwise recreates the exported seed in the process dictionary, and returns the state. See also export_seed/0.

seed(Alg :: builtin_alg(),
Seed :: {integer(), integer(), integer()}) ->
state()

Seeds random number generation with the specified algorithm and integers in the process dictionary and returns the state.

seed_s(AlgOrStateOrExpState ::
builtin_alg() | state() | export_state()) ->
state()

Seeds random number generation with the specified algorithm and time-dependent data if AlgOrStateOrExpState is an algorithm.

Otherwise recreates the exported seed and returns the state. See also export_seed/0.

seed_s(Alg :: builtin_alg(),
Seed :: {integer(), integer(), integer()}) ->
state()

Seeds random number generation with the specified algorithm and integers and returns the state.

uniform() -> X :: float()

Returns a random float uniformly distributed in the value range 0.0 =< X < 1.0 and updates the state in the process dictionary.

The generated numbers are of the form N * 2.0^(-53), that is, equally spaced in the interval.

Warning:

This function may return exactly 0.0 which can be fatal for certain applications. If that is undesired you can use (1.0 - rand:uniform()) to get the interval 0.0 < X =< 1.0.

If neither endpoint is desired you can test and re-try like this:

my_uniform() ->
    case rand:uniform() of
        0.0 -> my_uniform();
        X -> X
    end.

uniform(N :: integer() >= 1) -> X :: integer() >= 1

Returns, for a specified integer N >= 1, a random integer uniformly distributed in the value range 1 =< X =< N and updates the state in the process dictionary.

uniform_s(State :: state()) -> {X :: float(), NewState :: state()}

Returns, for a specified state, random float uniformly distributed in the value range 0.0 =< X < 1.0 and a new state.

The generated numbers are of the form N * 2.0^(-53), that is, equally spaced in the interval.

Warning:

This function may return exactly 0.0, which can be fatal for certain applications. If that is undesired you can subtract the generated value from 1.0, that is use {1.0 - X, NewState}, to get the interval 0.0 < X =< 1.0.

If neither endpoint is desired you can test and re-try like this:

my_uniform(State) ->
    case rand:uniform_s(State) of
        {0.0, NewState} -> my_uniform(NewState);
        Result -> Result
    end.

uniform_s(N :: integer() >= 1, State :: state()) ->
{X :: integer() >= 1, NewState :: state()}

Returns, for a specified integer N >= 1 and a state, a random integer uniformly distributed in the value range 1 =< X =< N and a new state.

COMMENTS