
AlexNet, the AI Model That Started It All, Released In Source Code Form (zdnet.com)

An anonymous reader quotes a report from ZDNet: There are many stories of how artificial intelligence came to take over the world, but one of the most important developments is the emergence in 2012 of AlexNet, a neural network that, for the first time, demonstrated a huge jump in a computer's ability to recognize images. Thursday, the Computer History Museum (CHM), in collaboration with Google, released for the first time the AlexNet source code written by University of Toronto graduate student Alex Krizhevsky, placing it on GitHub for all to peruse and download.

"CHM is proud to present the source code to the 2012 version of Alex Krizhevsky, Ilya Sutskever, and Geoffery Hinton's AlexNet, which transformed the field of artificial intelligence," write the Museum organizers in the readme file on GitHub. Krizhevsky's creation would lead to a flood of innovation in the ensuing years, and tons of capital, based on proof that with sufficient data and computing, neural networks could achieve breakthroughs previously viewed as mainly theoretical.
The Computer History Museum's software historian, Hansen Hsu, published an essay describing how he spent five years negotiating with Google to release the code.

Comments Filter:
  • describing how he spent five years negotiating with Google to release the code.

    The most surprising part of this story is that he could get anyone to talk to him.

    • by Anonymous Coward

      "In 2020, I reached out to Alex Krizhevsky to ask about the possibility of allowing CHM to release the AlexNet source code, due to its historical significance. He connected me to Geoff Hinton, who was working at Google at the time."

      Personal introduction. It's not like he just filed a support request or whatever.

  • "Why is the discussion around AlexNet and its historical impact ignoring the attention mechanism? 'Attention is All You Need' showed that attention-based architectures like Transformers donâ(TM)t even require convolutional or recurrent neural networks. Given that attention has now largely supplanted traditional neural networks in many domains, isn't it time to reframe AI history beyond just the AlexNet breakthrough?"

    • Because I think they're different steps...
      AlexNet changed the discussion around machine learning.
      Before it, most experts honestly believed that neural networks would never outperform traditional machine-learning methods - that hand-engineered features were a necessity for performance.
      AlexNet showed all of that to be flat-out wrong when it outperformed everything by learning from nothing but data.

      "Attention Is All You Need" revolutionized what you can do with a neural network (since it basi
    • There is no connection between AlexNet and Transformers, so I'm not sure why you are considering them together. They represent totally different kinds of breakthrough.

      AlexNet's historical importance is in kick-starting the modern "deep learning" neural network revolution. It wasn't an architectural breakthrough, but rather a demonstration of what could be achieved at scale and with GPU acceleration.

      The interest in the AlexNet source code is presumably because that is what was special about it - it was a hand-written CUDA implementation, tuned for the GPUs of the day (a minimal sketch of the general architecture, in modern terms, follows below).
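
For a concrete picture of what that last comment describes, here is a minimal sketch of an AlexNet-style network in modern PyTorch. This is an illustration only: the code CHM actually released is Krizhevsky's original hand-written C++/CUDA (cuda-convnet), and the channel sizes below follow the common single-GPU torchvision-style variant rather than the 2012 two-GPU layout.

    import torch
    import torch.nn as nn

    # AlexNet-style stack: five conv layers, three max-pools, three FC layers.
    class AlexNetSketch(nn.Module):
        def __init__(self, num_classes: int = 1000):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(3, 64, kernel_size=11, stride=4, padding=2), nn.ReLU(),
                nn.MaxPool2d(kernel_size=3, stride=2),
                nn.Conv2d(64, 192, kernel_size=5, padding=2), nn.ReLU(),
                nn.MaxPool2d(kernel_size=3, stride=2),
                nn.Conv2d(192, 384, kernel_size=3, padding=1), nn.ReLU(),
                nn.Conv2d(384, 256, kernel_size=3, padding=1), nn.ReLU(),
                nn.Conv2d(256, 256, kernel_size=3, padding=1), nn.ReLU(),
                nn.MaxPool2d(kernel_size=3, stride=2),
            )
            self.classifier = nn.Sequential(
                nn.Dropout(),
                nn.Linear(256 * 6 * 6, 4096), nn.ReLU(),
                nn.Dropout(),
                nn.Linear(4096, 4096), nn.ReLU(),
                nn.Linear(4096, num_classes),
            )

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            x = self.features(x)       # (N, 256, 6, 6) for 224x224 input
            x = torch.flatten(x, 1)
            return self.classifier(x)

    # Usage: classify a random batch of two 224x224 RGB images.
    logits = AlexNetSketch()(torch.randn(2, 3, 224, 224))
    print(logits.shape)  # torch.Size([2, 1000])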
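
And for the "Attention Is All You Need" point raised earlier in the thread: the mechanism that needs no convolutions or recurrence boils down to one small formula, scaled dot-product attention, softmax(QK^T / sqrt(d_k)) V. Below is a self-contained NumPy sketch; the shapes and variable names are illustrative assumptions, not taken from any reference implementation.

    import numpy as np

    def scaled_dot_product_attention(Q, K, V):
        # Q: (n, d_k) queries, K: (m, d_k) keys, V: (m, d_v) values.
        d_k = Q.shape[-1]
        scores = Q @ K.T / np.sqrt(d_k)                 # (n, m) similarities
        scores -= scores.max(axis=-1, keepdims=True)    # numerical stability
        weights = np.exp(scores)
        weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
        return weights @ V                              # (n, d_v)

    # Self-attention over 4 tokens with 8-dimensional embeddings.
    rng = np.random.default_rng(0)
    x = rng.normal(size=(4, 8))
    print(scaled_dot_product_attention(x, x, x).shape)  # (4, 8)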

"Just think, with VLSI we can have 100 ENIACS on a chip!" -- Alan Perlis

Working...
close