sigmoids: ReLU


ReLU:
    description:
        apply ReLU to the coefficient of each ket in the superposition:
        if the coefficient x <= 0, return 0, else return x
        see: https://en.wikipedia.org/wiki/Rectifier_(neural_networks)

    examples:
        ReLU (3|a> + 2.2|b> - 3 |c> + |d>)
            3|a> + 2.200000|b> + 0|c> + |d>
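
    python sketch:
        a minimal sketch, not the engine's implementation: it assumes a superposition
        is represented as a list of (label, coefficient) pairs, and the helper name
        ReLU_sp is hypothetical. It applies ReLU to each coefficient, matching the
        description and example above.

        def ReLU_sp(superposition):
            # negative or zero coefficients become 0, positive ones pass through unchanged
            return [(label, coeff if coeff > 0 else 0) for label, coeff in superposition]

        sp = [("a", 3), ("b", 2.2), ("c", -3), ("d", 1)]
        print(ReLU_sp(sp))
        # [('a', 3), ('b', 2.2), ('c', 0), ('d', 1)]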

    see also:
        clean, invert
