1 comment

  • mlepath 2 hours ago
    As an ML practitioner, I am not a big fan of the concept of the singularity, and I wholly agree that IF it happens it will be more like a sunrise than a light switch, but...

    I think thinking about it IS useful. Problems like value alignment are really difficult. No one alive knows what to do about value alignment, and even if it takes us 500 years to reach a real singularity, it may take that long to solve the value alignment problem.