Alibek Jakupov

Machine Learning and Music : Grace and Beauty (part III)

Updated: Nov 19, 2021




I am in need of music that would flow
Over my fretful, feeling fingertips,
Over my bitter-tainted, trembling lips,
With melody, deep, clear, and liquid-slow.
Oh, for the healing swaying, old and low,
Of some song sung to rest the tired dead,
A song to fall like water on my head,
And over quivering limbs, dream flushed to glow!

Elizabeth Bishop, ‘I Am in Need of Music’


In the previous article we discussed the timeline of Artificial Intelligence from 1913 to 1957.


In this article we cover the next period in the fascinating shared history of AI and music.



1963: Tic-Tac-Toe, West Side Story and Little Johnny Taylor



Donald Michie was a British machine learning researcher who worked for the Government Code and Cypher School at Bletchley Park during World War II, where he worked on breaking the German teleprinter cipher known as "Tunny". In 1963 he described a 'machine' that learned to play tic-tac-toe: it consisted of 304 matchboxes and a supply of beads, and it learned through reinforcement.
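
To give a flavor of how MENACE learned, here is a minimal sketch of bead-based reinforcement reduced to a single decision point (the move names and win rates are made up for illustration; this is not a full noughts-and-crosses player):

```python
import random

# MENACE-style bead reinforcement: each "matchbox" holds beads for the
# legal moves; a move is chosen by drawing a bead at random, winning
# moves gain beads, and losing moves lose one.
beads = {"corner": 4, "edge": 4, "centre": 4}           # initial beads per move
win_rate = {"corner": 0.4, "edge": 0.1, "centre": 0.9}  # hypothetical payoffs

for game in range(2000):
    moves, weights = zip(*beads.items())
    move = random.choices(moves, weights=weights)[0]    # draw a bead
    if random.random() < win_rate[move]:
        beads[move] += 3                                # reward: add beads
    elif beads[move] > 1:
        beads[move] -= 1                                # punish: remove a bead

print(beads)  # beads pile up behind "centre", the most rewarding move
```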


In 1963, as in the year before, the most popular pop album was the West Side Story soundtrack. Released in 1961, it spent 54 weeks at No. 1 on Billboard's album charts, the longest run at No. 1 of any album in history.


The most popular soul single was "Part Time Love" by Little Johnny Taylor, an American blues and soul singer who recorded throughout the 1960s and 1970s.



1967: Nearest Neighbors, Aretha Franklin and Greatest Hits




Everyone studying machine learning is familiar with the nearest neighbors algorithm. The same greedy idea was also one of the first approaches to the famous traveling salesman problem.


These are the steps of the algorithm (a runnable sketch follows the list):
  1. Initialize all vertices as unvisited.

  2. Select an arbitrary vertex, set it as the current vertex u. Mark u as visited.

  3. Find the shortest edge connecting the current vertex u to an unvisited vertex v.

  4. Set v as the current vertex u. Mark v as visited.

  5. If all the vertices in the domain are visited, then terminate. Else, go to step 3.

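A minimal sketch of these steps in Python, assuming cities are given as 2-D coordinates with Euclidean distances (the coordinates below are made up for illustration):

```python
import math

def nearest_neighbor_tour(points):
    """Greedy nearest-neighbor heuristic for the traveling salesman problem."""
    unvisited = set(range(1, len(points)))  # step 1: all vertices unvisited
    tour = [0]                              # step 2: pick a start vertex u
    while unvisited:                        # step 5: stop when all are visited
        u = tour[-1]
        # step 3: shortest edge from the current vertex to an unvisited one
        v = min(unvisited, key=lambda w: math.dist(points[u], points[w]))
        unvisited.remove(v)                 # step 4: mark v visited,
        tour.append(v)                      #         make it the current vertex
    return tour

cities = [(0, 0), (2, 1), (1, 5), (4, 3)]
print(nearest_neighbor_tour(cities))        # [0, 1, 3, 2]
```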

The nearest neighbor rule, formalized by Cover and Hart in 1967, marked the start of basic pattern recognition.


The most popular soul composition of the year was "Respect" by Aretha Franklin, a cover of Otis Redding's 1965 original; it is worth mentioning that the two versions differ significantly in their musical arrangement. The most-listened-to album of the same genre was Greatest Hits by The Temptations, released on the Gordy (Motown) label; it peaked at No. 5 on the Billboard 200 album chart.



1969: Marvin Minsky, Seymour Papert and Sugar, Sugar



"Sugar, Sugar", a song written by Jeff Barry and Andy Kim, was originally recorded by the virtual band the Archies. The single reached number one in the US on the Billboard Hot 100 chart in 1969 and remained there for four weeks. The most listened country album was Wichita Lineman by Glen Campbell.


This year also marked the beginning of pessimism about AI's capabilities. In 1969 Marvin Minsky and Seymour Papert published their book Perceptrons, which described some of the limitations of perceptrons and neural networks. One of the book's central claims was that single-layer perceptrons are fundamentally limited, which hindered research into neural networks for years afterward.
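
The canonical illustration of that limitation is the XOR function, which no single-layer perceptron can represent. A minimal sketch (my own, not taken from the book itself): the perceptron learning rule never converges on XOR, no matter how long it trains.

```python
# Train a single-layer perceptron on XOR: the data is not linearly
# separable, so the weight updates never drive the error to zero.
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
y = [0, 1, 1, 0]                       # XOR labels

w0, w1, b = 0.0, 0.0, 0.0
for epoch in range(1000):
    errors = 0
    for (x0, x1), target in zip(X, y):
        pred = 1 if w0 * x0 + w1 * x1 + b > 0 else 0
        if pred != target:             # perceptron learning rule
            errors += 1
            w0 += (target - pred) * x0
            w1 += (target - pred) * x1
            b += (target - pred)
    if errors == 0:
        break

print(f"misclassified after training: {errors}")  # never reaches 0
```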



1970: Backpropagation, the Jacksons and Bridge over Troubled Water




The Jackson 5 (stylized as the Jackson 5ive), later known as the Jacksons, were an American pop band composed of members of the Jackson family. The group was founded in 1964 in Gary, Indiana by brothers Jackie, Tito, and Jermaine, with younger brothers Marlon and Michael Jackson joining soon after.

In 1970 "I'll Be There", a soul song written by Berry Gordy, Hal Davis, Bob West, and Willie Hutch and recorded by The Jackson 5, became the year-end number-one single. The most listened pop single became "Bridge over Troubled Water" by Simon & Garfunkel.


In 1970 Seppo Linnainmaa, a Finnish mathematician and computer scientist, published the general method for automatic differentiation (AD) of discrete connected networks of nested differentiable functions. Data scientists may recognize this: applied in reverse mode, it is the approach we know today as backpropagation.
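
A minimal sketch of the idea in Python (a toy scalar reverse-mode AD, not Linnainmaa's original formulation): each operation records its local derivatives, and backward() accumulates the chain rule from the output back to the inputs, which is exactly what deep learning frameworks do today.

```python
class Var:
    """A scalar that remembers how it was computed."""
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents  # (parent, local_derivative) pairs
        self.grad = 0.0

    def __mul__(self, other):
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

    def __add__(self, other):
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def backward(self, seed=1.0):
        # Propagate derivatives from the output back through the graph,
        # multiplying local derivatives along each path (the chain rule).
        self.grad += seed
        for parent, local in self.parents:
            parent.backward(seed * local)

x, w = Var(3.0), Var(2.0)
loss = x * w + w      # d(loss)/dw = x + 1 = 4
loss.backward()
print(w.grad)         # 4.0
```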



References:

  1. Child, Oliver. "Menace: the Machine Educable Noughts And Crosses Engine". Chalkdust Magazine. Retrieved 16 Jan 2018.

  2. Cohen, Harvey. "The Perceptron". Retrieved 5 June 2016.

  3. Colner, Robert. "A brief history of machine learning". SlideShare. Retrieved 5 June 2016.

  4. Seppo Linnainmaa (1970). "The representation of the cumulative rounding error of an algorithm as a Taylor expansion of the local rounding errors." Master's Thesis (in Finnish), Univ. Helsinki, 6–7.

  5. Linnainmaa, Seppo (1976). "Taylor expansion of the accumulated rounding error". BIT Numerical Mathematics. 16 (2): 146–160. doi:10.1007/BF01931367.

  6. Griewank, Andreas (2012). "Who Invented the Reverse Mode of Differentiation?". Documenta Mathematica, Extra Volume ISMP: 389–400.

  7. Griewank, Andreas and Walther, A. Principles and Techniques of Algorithmic Differentiation, Second Edition. SIAM, 2008.

  8. Schmidhuber, Jürgen (2015). "Deep learning in neural networks: An overview". Neural Networks. 61: 85–117. arXiv:1404.7828. Bibcode:2014arXiv1404.7828S.

  9. Schmidhuber, Jürgen (2015). Deep Learning. Scholarpedia, 10(11):32832. Section on Backpropagation
