1800
Joseph Marie Jacquard teaches a loom to read punch cards, creating the first heavily multi-threaded processing unit. His invention was fiercely opposed by the silk-weavers who foresaw the birth of Skynet.
1842
Ada Lovelace gets bored of being noble and scribbles in a notebook what will later be known as the first published computer program, only slightly inconvenienced by the fact that there were no computers around at the time.
1936
Alan Turing invents everything; the British courts do not approve and have him chemically castrated. The Queen later pardoned him, but unfortunately he had already been dead for centuries at that time.
1936
Alonzo Church also invents everything alongside Turing, but from across the pond, and is not castrated by the Queen.
1959
Grace Hopper invents the first enterprise-ready, business-oriented programming language and calls it the “common business-oriented language”, or COBOL for short.
1964
John Kemeny and Thomas Kurtz decide programming is too hard and they need to go back to basics; they call their programming language BASIC.
1970
Niklaus Wirth makes Pascal become a thing, along with a number of other languages; he likes making languages.
He also invents Wirth’s law, which makes Moore’s law obsolete because software developers will write software so bloated that even mainframes cannot keep up. This will later be proven true by the invention of Electron.js.
1972
Dennis Ritchie gets bored during work hours at Bell Labs, so he decides to make C, which has curly braces, so it ends up being a huge success. Afterwards he adds segmentation faults and other developer-friendly features to aid productivity. Still having a couple of hours remaining, he and his buddies at Bell Labs decide to make an example program demonstrating C; they make an operating system called Unix.
1980
Alan Kay invents object-oriented programming and calls it Smalltalk. In Smalltalk everything is an object; even an object is an object. No one really has time to understand the meaning of small talk.
1987
Larry Wall has a religious experience, becomes a preacher and makes Perl the doctrine.
1983
Jean Ichbiah notices that Ada Lovelace’s programs never actually ran and decides to create a language bearing her name, but the language continues to not be run.
1986
Brad Cox and Tom Love decide to make an unreadable version of C based on Smalltalk, which they call Objective-C, but no one is able to understand the syntax.
1983
Bjarne Stroustrup travels back to the future and notices that C is not taking enough time to compile; he adds every feature he can think of to the language and names it C++. Programmers everywhere adopt it so they have genuine excuses to watch cat videos and read xkcd while working.
1991
Guido van Rossum does not like curly braces and invents Python; syntax choices were inspired by Monty Python’s Flying Circus.
1993
Roberto Ierusalimschy and friends decide they need a scripting language local to Brazil; during localization, an error is made that makes indices start counting from 1 instead of 0. They name it Lua.
1994
Rasmus Lerdorf makes a template engine for his personal homepage CGI scripts and releases his dotfiles on the web. The world decides to use these dotfiles for everything, and in a frenzy Rasmus throws some extra database bindings in there for the heck of it and calls it PHP.
1995
Yukihiro Matsumoto is not very happy, and he notices other programmers are not happy either. He creates Ruby to make programmers happy. After creating Ruby, “Matz” is happy, the Ruby community is happy, everyone is happy.
1995
Brendan Eich takes the weekend off to design a language that will be used to power every single web browser in the world and eventually Skynet. He originally goes to Netscape and says it is called LiveScript, but Java becomes popular during the code review, so they decide they had better use curly braces and rename it to JavaScript. Java turns out to be a trademark that would get them in trouble; JavaScript later gets renamed to ECMAScript, and everyone still calls it JavaScript.
1996
James Gosling invents Java, the first truly overly verbose object-oriented programming language where design patterns rule supreme over pragmatism. It’s super effective: the manager provider container provider service manager singleton manager provider pattern is born.
2001
Anders Hejlsberg re-invents Java and calls it C# because programming in C feels cooler than Java. Everyone loves this new version of Java for totally not being like Java.
2005
David Heinemeier Hansson creates a web framework called Ruby on Rails; people no longer remember that the two are separate things.
2006
John Resig writes a helper library for JavaScript; everyone thinks it’s a language and makes a career of copying and pasting jQuery code from the internets.
2009
Ken Thompson and Rob Pike decide to make a language like C, but with more safety equipment, better marketing, and Gophers as mascots. They call it Go, make it open source, and sell Gopher-branded kneepads and helmets separately.
2010
Graydon Hoare also wants to make a language like C; he calls it Rust. Everyone demands that every single piece of software be rewritten in Rust immediately. Graydon wants shinier things and starts working on Swift for Apple.
2012
Anders Hejlsberg wants to write C# in web browsers, so he designs TypeScript, which is JavaScript but with more Java in it.
2013
Jeremy Ashkenas wants to be happy like Ruby developers, so he creates CoffeeScript, which compiles to JavaScript but looks more like Ruby. Jeremy never becomes truly happy like Matz and the Ruby developers.
2014
Chris Lattner makes Swift with the primary design goal of not being Objective-C; in the end it looks like Java.
James Iry, who I can only assume is a fellow computer science historian, made some similar observations back in 2009.
John McCarthy should be mentioned here too. He was very important. Lisp has been used in artificial intelligence for a long time.