©2018 by macnabbs.

 
  • Alibek Jakupov

Machine Learning and Music: Grace and Beauty (Ending)



Her wattled fingers can’t stroke the keys with much grace or assurance anymore, and the tempo is always rubato, halting, but still that sound—notes quivering and clear in their singularity, filing down the hallway— aches with pure intention, the melody somehow prettier as a remnant than whatever it used to be.

“Piano” by Dan Howell


In the previous article we discussed the timeline of artificial intelligence from 2002 to 2010. In this article we cover the next period in the fascinating shared history of AI and music.


2011: Beating humans in Jeopardy, Adele and Sure Thing




Jeopardy! is an American television game show created by Merv Griffin. The show features a quiz competition in which contestants are presented with general knowledge clues in the form of answers, and must phrase their responses in the form of questions. The original daytime version debuted on NBC on March 30, 1964, and aired until January 3, 1975. A weekly nighttime syndicated edition aired from September 1974 to September 1975, and a revival, The All-New Jeopardy!, ran on NBC from October 1978 to March 1979. The current version, a daily syndicated show produced by Sony Pictures Television, premiered on September 10, 1984. Both NBC versions and the weekly syndicated version were hosted by Art Fleming. Don Pardo served as announcer until 1975, and John Harlan announced for the 1978–1979 show. Since its inception, the daily syndicated version has featured Alex Trebek as host and Johnny Gilbert as announcer. With over 8,000 episodes aired, the daily syndicated version of Jeopardy! has won a record 33 Daytime Emmy Awards as well as a Peabody Award. In 2013, the program was ranked No. 45 on TV Guide's list of the 60 greatest shows in American television history. Jeopardy! has also gained a worldwide following with regional adaptations in many other countries. The daily syndicated series' 35th season premiered on September 10, 2018.

quote from Wikipedia


In 2011, IBM's Watson beat two human champions in a Jeopardy! competition. The system applied machine learning, natural language processing and information retrieval techniques.

Watson is a question-answering computer system capable of answering questions posed in natural language, developed in IBM's DeepQA project by a research team led by principal investigator David Ferrucci. Watson was named after IBM's first CEO, industrialist Thomas J. Watson. The computer system was initially developed to answer questions on the quiz show Jeopardy! and, in 2011, the Watson computer system competed on Jeopardy! against legendary champions Brad Rutter and Ken Jennings, winning the first place prize of $1 million. In February 2013, IBM announced that Watson software system's first commercial application would be for utilization management decisions in lung cancer treatment at Memorial Sloan Kettering Cancer Center, New York City, in conjunction with WellPoint (now Anthem). IBM Watson's former business chief, Manoj Saxena, says that 90% of nurses in the field who use Watson now follow its guidance.

quote from Wikipedia


This event marked an important step in the evolution of machine learning, since artificial intelligence had long struggled to understand the context of the clues. As a result, human players tended to generate responses faster than a machine, especially for short clues.


The year when AI started to surpass human performance was also marked by the release of Adele's "Rolling in the Deep", which became 2011's most popular song.

"Rolling in the Deep" is a song recorded by English singer-songwriter Adele for her second studio album, 21. It is the lead single and opening track on the album. The song was written by Adele and Paul Epworth. The singer herself describes it as a "dark blues-y gospel disco tune". The largest crossover hit in the United States from the past 25 years, "Rolling in the Deep" gained radio airplay from many different radio formats. It was first released on 2010 as the lead single from 21 in digital download format. The lyrics describe the emotions of a scorned lover.

quote from Wikipedia


The most popular R&B song of 2011 was "Sure Thing" by Miguel.


2012: Cats on YouTube, Gotye and 21



The most popular song of 2012 was "Somebody That I Used to Know", written by Belgian-Australian singer-songwriter Gotye, featuring New Zealand singer Kimbra.

"Somebody That I Used to Know" is a mid-tempo ballad. It samples Luiz Bonfá's instrumental "Seville" from his 1967 album Luiz Bonfa Plays Great Songs. The song received a positive reception from critics, who noted the similarities between the song and works by Sting, Peter Gabriel, and American folk band Bon Iver. In Australia, the song won the Triple J Hottest 100 poll at the end of 2011, as well as ARIA Awards for song of the year and best video, while Kimbra was voted best female artist and Gotye was named best male artist and producer of the year. The song came ninth in the Triple J Hottest 100 of the Past 20 Years, 2013. In 2013, the song won two Grammy Awards for Best Pop Duo/Group Performance and Record of the Year.

quote from Wikipedia

The most popular album of 2012 was 21 by Adele.


In 2012, the Google Brain team, a deep learning artificial intelligence research team at Google led by Andrew Ng and Jeff Dean, created a neural network that learnt to recognize cats by analyzing unlabeled images taken from frames of YouTube videos.

Formed in the early 2010s, Google Brain combines open-ended machine learning research with systems engineering and Google-scale computing resources.

quote from Wikipedia

The project was particularly interesting because Google's "brain simulator" taught itself to recognize objects without ever being given labeled examples.
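The key point is that the network learned features with no labels at all. As a rough illustration of that idea, here is a tiny NumPy autoencoder of our own design, nothing like the scale of Google's nine-layer sparse autoencoder trained on YouTube frames: it discovers features simply by learning to reconstruct its own unlabeled input.

```python
import numpy as np

# Toy autoencoder: learns features from unlabeled data by
# reconstructing its own input -- the core idea behind the
# Google Brain "cat neuron" experiment.
rng = np.random.default_rng(0)

n_inputs, n_hidden = 16, 4          # 16 "pixels", 4 learned features
W1 = rng.normal(0, 0.1, (n_hidden, n_inputs))   # encoder weights
W2 = rng.normal(0, 0.1, (n_inputs, n_hidden))   # decoder weights
X = rng.random((200, n_inputs))     # unlabeled training "images"

lr = 0.05
for _ in range(2000):
    H = np.tanh(X @ W1.T)           # encode: hidden features
    X_hat = H @ W2.T                # decode: reconstruction
    err = X_hat - X                 # reconstruction error
    # Backpropagate the reconstruction loss -- no labels needed.
    dW2 = err.T @ H / len(X)
    dH = (err @ W2) * (1 - H**2)
    dW1 = dH.T @ X / len(X)
    W1 -= lr * dW1
    W2 -= lr * dW2

loss = float(np.mean((np.tanh(X @ W1.T) @ W2.T - X) ** 2))
print(f"final reconstruction loss: {loss:.4f}")
```

After training, the hidden units have become feature detectors for whatever structure the data contains, even though nothing ever told the network what to look for.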


2014: Facebook, Happy and Frozen



In 2014, Facebook researchers published their work on DeepFace, a deep learning facial recognition system. The system used neural networks to identify faces with 97.35% accuracy, an improvement of more than 27% over previous systems, rivaling human performance.

DeepFace is a deep learning facial recognition system created by a research group at Facebook. It identifies human faces in digital images. It employs a nine-layer neural net with over 120 million connection weights, organized as a siamese network, and was trained on four million images uploaded by Facebook users. The system is said to be 97% accurate, compared to 85% for the FBI's Next Generation Identification system. One of the creators of the software, Yaniv Taigman, came to Facebook via their 2007 acquisition of Face.com.

quote from Wikipedia
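The "siamese network" in the quote refers to a simple but powerful trick: the same network embeds both images, and a distance threshold decides whether they show the same person. The sketch below is only an illustration of that idea; the names (embed, same_person), the threshold, and the single random ReLU layer standing in for DeepFace's nine-layer net are all our own assumptions.

```python
import numpy as np

# Toy siamese-style face verification: ONE shared set of weights
# embeds both images, and a distance threshold makes the decision.
rng = np.random.default_rng(1)
W = rng.normal(0, 0.1, (8, 64))          # shared embedding weights

def embed(image_vec):
    """Map a flattened 64-pixel 'image' to a unit-norm 8-d embedding."""
    v = np.maximum(0, W @ image_vec)     # one ReLU layer
    return v / (np.linalg.norm(v) + 1e-9)

def same_person(img_a, img_b, threshold=0.8):
    """Verify by the distance between the two shared embeddings."""
    return float(np.linalg.norm(embed(img_a) - embed(img_b))) < threshold

face = rng.random(64)
noisy_same = face + rng.normal(0, 0.01, 64)   # same face, slight noise
other = rng.random(64)                         # an unrelated face

print(same_person(face, noisy_same))  # near-identical pair
print(same_person(face, other))       # unrelated pair
```

Because the weights are shared, the network cannot "cheat" by treating the two inputs differently; it is forced to learn an embedding in which distance corresponds to identity.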


The most popular song of 2014 was Happy by Pharrell Williams.

"Happy" is a song written, produced, and performed by American singer Pharrell Williams, from the Despicable Me 2 soundtrack album. It also served as the lead single from Williams' second studio album, Girl (2014). It was first released on November 21, 2013, alongside a long-form music video. The song was reissued on December 16, 2013, by Back Lot Music under exclusive license to Columbia Records, a division of Sony Music. "Happy" is an uptempo soul and neo soul song on which Williams's falsetto voice has been compared to Curtis Mayfield by critics. The song has been highly successful, peaking at No. 1 in the United States, United Kingdom, Canada, Ireland, New Zealand, and 19 other countries. It was the best-selling song of 2014 in the United States with 6.45 million copies sold for the year, as well as in the United Kingdom with 1.5 million copies sold for the year. It reached No. 1 in the UK on a record-setting three separate occasions and became the most downloaded song of all time in the UK in September 2014; it is the eighth highest-selling single of all time in the country. It was nominated for an Academy Award for Best Original Song. A live rendition of the song won the Grammy Award for Best Pop Solo Performance at the 57th Annual Grammy Awards.

quote from Wikipedia

And the most popular album was the Frozen soundtrack.


2016: H.O.L.Y., 25 and Beating Humans in Go


The audience chose "H.O.L.Y." as the most popular country single of 2016, while the most listened-to pop album was, once again, Adele's 25.

25 is the third studio album by English singer-songwriter Adele, released on 20 November 2015 by XL Recordings and Columbia Records. Issued nearly five years after her previous album, the internationally successful 21 (2011), the album is titled as a reflection of her life and frame of mind at 25 years old and is termed a "make-up record". Its lyrical content features themes of Adele "yearning for her old self, her nostalgia", and "melancholia about the passage of time" according to an interview with the singer by Rolling Stone, as well as themes of motherhood and regret. In contrast to Adele's previous works, the production of 25 incorporated the use of electronic elements and creative rhythmic patterns, with elements of 1980s R&B and organs. Like 21, Adele worked with producer and songwriter Paul Epworth and Ryan Tedder, along with new collaborations with Max Martin and Shellback, Greg Kurstin, Danger Mouse, the Smeezingtons, Samuel Dixon, and Tobias Jesso Jr.

quote from Wikipedia


The most important achievement of 2016 was the victory of Google's AlphaGo program over an unhandicapped professional human player. It was the first computer Go program to achieve such an impressive result. The solution combined machine learning and tree search techniques; it was later improved as AlphaGo Zero and then generalized to chess and other two-player games with AlphaZero in 2017.

AlphaGo is a computer program that plays the board game Go. It was developed by Alphabet Inc.'s Google DeepMind in London. AlphaGo had three far more powerful successors, called AlphaGo Master, AlphaGo Zero and AlphaZero. In October 2015, the original AlphaGo became the first computer Go program to beat a human professional Go player without handicaps on a full-sized 19×19 board. In March 2016, it beat Lee Sedol in a five-game match, the first time a computer Go program has beaten a 9-dan professional without handicaps. Although it lost to Lee Sedol in the fourth game, Lee resigned in the final game, giving a final score of 4 games to 1 in favour of AlphaGo. In recognition of the victory, AlphaGo was awarded an honorary 9-dan by the Korea Baduk Association. The lead up and the challenge match with Lee Sedol were documented in a documentary film also titled AlphaGo, directed by Greg Kohs. It was chosen by Science as one of the Breakthrough of the Year runners-up on 22 December 2016. At the 2017 Future of Go Summit, its successor AlphaGo Master beat Ke Jie, the world No.1 ranked player at the time, in a three-game match (the even more powerful AlphaGo Zero already existed but was not yet announced). After this, AlphaGo was awarded professional 9-dan by the Chinese Weiqi Association. AlphaGo and its successors use a Monte Carlo tree search algorithm to find its moves based on knowledge previously "learned" by machine learning, specifically by an artificial neural network (a deep learning method) by extensive training, both from human and computer play. A neural network is trained to predict AlphaGo's own move selections and also the winner's games. This neural net improves the strength of tree search, resulting in higher quality of move selection and stronger self-play in the next iteration.

quote from Wikipedia
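The quote above mentions the Monte Carlo tree search at AlphaGo's core. To illustrate just the "random playout" ingredient, here is a toy Monte Carlo player for the simple game of Nim; the game, the function names and the playout count are all our own choices, and AlphaGo's real search is far more sophisticated, guided by neural-network policy and value estimates rather than pure chance.

```python
import random

# Monte Carlo move selection for Nim: players alternately take
# 1-3 stones, and whoever takes the last stone wins. Each candidate
# move is scored by how often random playouts from it end in a win.

def random_playout(stones, my_turn):
    """Finish the game with uniformly random moves; True if 'I' win."""
    while stones > 0:
        stones -= random.randint(1, min(3, stones))
        if stones == 0:
            return my_turn          # whoever just moved took the last stone
        my_turn = not my_turn
    return not my_turn              # no stones left: the previous mover won

def best_move(stones, n_playouts=5000):
    """Choose the move whose random playouts win most often."""
    scores = {}
    for take in range(1, min(3, stones) + 1):
        wins = sum(random_playout(stones - take, my_turn=False)
                   for _ in range(n_playouts))
        scores[take] = wins / n_playouts
    return max(scores, key=scores.get)

random.seed(0)
# From 10 stones the provably best move is to take 2 (leaving a
# multiple of 4); the sampled win rates usually point the same way.
print(best_move(10))
```

Replacing the uniformly random moves with a learned policy, and the win counts with a learned value estimate, is exactly the step that took Monte Carlo Go programs from amateur level to AlphaGo.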

So why had no computer program been able to beat a professional human player at Go before? We may find the answer in the game's rules.

Despite its relatively simple rules, Go is very complex. Compared to chess, Go has both a larger board with more scope for play and longer games, and, on average, many more alternatives to consider per move. The lower bound on the number of legal board positions in Go has been estimated to be 2 × 10^170.
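These figures are easy to sanity-check with back-of-the-envelope arithmetic. The snippet below is our own calculation, not from any cited source, and the ~10^47 figure for chess is a commonly quoted estimate rather than an exact count:

```python
import math

# Each of the 361 points on a 19x19 board is empty, black, or white,
# giving 3**361 raw configurations.
raw_go = 3 ** 361
print(f"3^361 has {len(str(raw_go))} digits")   # a ~10^172 number

# Only about 2 x 10^170 of those configurations are legal positions.
legal = 2 * 10 ** 170
print(f"legal fraction: {legal / raw_go:.3f}")  # roughly 1% of raw states

# Against a commonly quoted ~10^47 positions for chess, Go's legal
# state space is over 120 orders of magnitude larger.
print(f"orders of magnitude vs chess: {round(math.log10(legal) - 47)}")
```

A gap of more than 120 orders of magnitude is why the brute-force search that worked for chess was hopeless in Go, and why AlphaGo needed learned evaluations to prune the search.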

Conclusion


It all started with statistical methods that were first discovered and then refined. The world then saw pioneering machine learning research in the 1950s, conducted using simple algorithms. Later, Bayesian methods were introduced for probabilistic inference, followed by the so-called 'AI Winter' in the 1970s, as people were no longer sure about AI's effectiveness. But as the proverb says, where there is a will there is a way, and the rediscovery of backpropagation sparked a new wave of AI research. Shortly afterwards, in the 1980s, there was a dramatic change that shifted the whole process from a knowledge-based to a data-driven approach. Scientists started creating software to analyze large amounts of data. The main goal was to rediscover the natural laws underlying the observations, in other words, to learn from the initial data by drawing logical conclusions. During this period of machine learning's history, algorithms such as support vector machines (SVMs) and recurrent neural networks (RNNs) became commonly used. It also marked the start of the fields of computational complexity via neural networks and super-Turing computation. The early 2000s saw the rise of support vector clustering and other kernel methods, as well as unsupervised algorithms. From the 2010s onward, deep learning became achievable, which led to a wide range of applications based on machine learning algorithms.

quote from the first article


It has been a long and fascinating journey, and together we have tried to track the evolution of machine learning alongside the music timeline.


So, what is next? No one knows. And that is what is really thrilling about machine learning. It is now our turn to make history, so up we go!


Hope you enjoyed this series of articles. See you in the next series.


References

  1. Markoff, John (17 February 2011). "Computer Wins on 'Jeopardy!': Trivial, It's Not". New York Times. p. A1. Retrieved 5 June 2016.

  2. Le, Quoc V.; Ranzato, Marc'Aurelio; Monga, Rajat; Devin, Matthieu; Corrado, Greg; Chen, Kai; Dean, Jeffrey; Ng, Andrew Y. (2012). "Building high-level features using large scale unsupervised learning" (PDF). Proceedings of the 29th International Conference on Machine Learning, ICML 2012, Edinburgh, Scotland, UK, June 26 - July 1, 2012. icml.cc / Omnipress. arXiv:1112.6209. Bibcode:2011arXiv1112.6209L.

  3. Markoff, John (26 June 2012). "How Many Computers to Identify a Cat? 16,000". New York Times. p. B1. Retrieved 5 June 2016.

  4. Taigman, Yaniv; Yang, Ming; Ranzato, Marc'Aurelio; Wolf, Lior (24 June 2014). "DeepFace: Closing the Gap to Human-Level Performance in Face Verification". Conference on Computer Vision and Pattern Recognition. Retrieved 8 June 2016.

  5. Canini, Kevin; Chandra, Tushar; Ie, Eugene; McFadden, Jim; Goldman, Ken; Gunter, Mike; Harmsen, Jeremiah; LeFevre, Kristen; Lepikhin, Dmitry; Llinares, Tomas Lloret; Mukherjee, Indraneel; Pereira, Fernando; Redstone, Josh; Shaked, Tal; Singer, Yoram. "Sibyl: A system for large scale supervised machine learning" (PDF). Jack Baskin School of Engineering. UC Santa Cruz. Retrieved 8 June 2016.

  6. Woodie, Alex (17 July 2014). "Inside Sibyl, Google's Massively Parallel Machine Learning Platform". Datanami. Tabor Communications. Retrieved 8 June 2016.

  7. "Google achieves AI 'breakthrough' by beating Go champion". BBC News. BBC. 27 January 2016. Retrieved 5 June 2016.

  8. "AlphaGo". Google DeepMind. Google Inc. Retrieved 5 June 2016.