
Tyler_Hofstede

The world of intelligent algorithms: A look at machine learning through the lens of AMD Instinct™ accelerators

We live in an age of rapid technological advancement, and machine learning has permeated nearly every facet of our lives. From the personalized recommendations suggested to you when you sit down to stream a movie, to self-driving cars navigating our streets, machine learning is what makes these innovations possible. But what exactly is machine learning, and how is it transforming the world around us?

 

A simple definition for a complex topic

Machine learning is a subset of artificial intelligence (AI) that empowers computers to learn and improve from experience, through exposure to vast amounts of data, without being explicitly programmed. It allows systems to identify patterns, adapt to new data, and make informed decisions or predictions. Unlike traditional rule-based programming, where explicit instructions dictate the actions of a program, machine learning enables algorithms to learn and evolve based on data.
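To make that contrast concrete, here is a minimal Python sketch. The spam-filter task, the tiny dataset, and the use of scikit-learn are illustrative assumptions rather than anything described in this post: the rule-based function only ever does what its author wrote, while the learned model infers its own decision logic from labeled examples.

# Rule-based: the programmer hard-codes the decision logic.
def rule_based_spam_filter(message: str) -> bool:
    return "free money" in message.lower() or "act now" in message.lower()

# Machine learning: the decision logic is learned from labeled examples.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

messages = ["free money, act now", "meeting at 3pm", "win a prize today", "lunch tomorrow?"]
labels = [1, 0, 1, 0]  # 1 = spam, 0 = not spam (made-up examples)

model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(messages, labels)                       # the model finds the patterns itself
print(model.predict(["claim your free prize"]))   # likely [1]: learned, not hard-coded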

The term “machine learning” is not a new one; it can be traced back to the late 1950s, when Arthur Samuel coined it in reference to a computer that used artificial intelligence to play checkers. So, if machine learning as a concept is roughly as old as NASA, why did it take so long to capture the interest of the general public? In short, the rise in machine learning’s popularity today can be credited to the advent of big data, as well as massive increases in compute power, memory capacity, and bandwidth – but let’s explore more of what makes machine learning possible.

 

The Building Blocks of Machine Learning

Modern systems have the compute power to crunch through the immense amounts of data required for AI models to learn and improve. Without these powerful computers and seas of data, AI models would likely take far too long to produce relevant results, which makes such systems a crucial element in developing a working, efficient model. The arrival of exascale computers, such as the Frontier and LUMI systems, is a great example of an increase in compute performance driving an increase in model performance.

Furthermore, the dawn of neural networks (specifically the growth in the size of neural networks) and deep learning has also made machine learning workloads even more performant. This further contributes to what could be considered the “perfect storm” recipe behind the growth of machine learning workload adoption.

Data, lots and lots of data. For a machine learning algorithm to learn and provide insights, it needs information to learn from: the patterns found in data. The explosion of data creation over the last decade means there is far more information available to “fuel” the algorithms looking for those patterns.

Speaking of algorithms, these mathematical constructs are designed to find relationships within data. For example, an algorithm written for visual inspection in manufacturing would be fed millions of images of a component, let’s say a connecting rod for an engine, in various states. The algorithm would then find patterns in those images and be able to classify a part as defective – in our example, because the metal has burrs left over from the casting process.
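As a rough illustration of what such a model might look like, here is a minimal PyTorch sketch of a two-class image classifier (good vs. defective). The network size, the 64x64 inputs, and the random stand-in data are assumptions chosen to keep the example self-contained and runnable; a real inspection system would train a larger model on labeled photographs of actual parts.

import torch
import torch.nn as nn

class DefectClassifier(nn.Module):
    # A tiny convolutional network; real systems would be larger and often pretrained.
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, 2)  # two classes: good, defective

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))

model = DefectClassifier()
batch = torch.rand(8, 3, 64, 64)    # stand-in for a batch of 64x64 RGB part images
logits = model(batch)
predictions = logits.argmax(dim=1)  # 0 = good, 1 = defective (label convention assumed)
print(predictions)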

The algorithm comes to these conclusions through training, the process of exposing it to large amounts of data (pictures of connecting rods in the example above) and allowing it to find those relationships in the data. From there, the accuracy of the model is tested and evaluated by exposing it to new data. Then comes refining the model – an iterative process meant to improve it by adjusting hyperparameters, enhancing the dataset, or modifying the algorithm.
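That loop (train, evaluate on unseen data, refine hyperparameters, repeat) can be sketched in a few lines of Python. The synthetic dataset, the random-forest model, and the hyperparameter grid below are illustrative assumptions, not something prescribed by this post, chosen only to keep the example self-contained.

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, train_test_split

# Synthetic stand-in for a real labeled dataset.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)

# Hold out unseen data so the evaluation step really measures generalization.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# Refinement: iterate over hyperparameter settings and keep the best-performing model.
search = GridSearchCV(
    RandomForestClassifier(random_state=0),
    param_grid={"n_estimators": [50, 100], "max_depth": [5, 10, None]},
    cv=3,
)
search.fit(X_train, y_train)  # training
print("best hyperparameters:", search.best_params_)
print("accuracy on unseen data:", search.best_estimator_.score(X_test, y_test))  # evaluation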

 

AMD Instinct™ accelerators powering machine learning

In healthcare, the LUMI system is enabling researchers to use a neural network program that can detect cancer early and quickly simulate drug efficacy. Pathologists can then diagnose cancer growth and simulate bespoke patient reactions to various treatments, so patients get the best personalized care as quickly as possible.

KT Cloud has ambitious plans to introduce several new offerings, including an AI cloud service for public cloud users in the form of Infrastructure-as-a-Service (IaaS), Software-as-a-Service (SaaS), and application programming interfaces (APIs) – powered by an 11B-parameter Korean large language model (LLM) – for automated call center and commercial chatbot applications.

The University of Turku also took advantage of the LUMI system, the largest supercomputer in Europe, to train a 13B-parameter Finnish large language model (LLM). During the LUMI pilot, the team took the 176-billion-parameter BLOOM model that Hugging Face had created and extended it to Finnish by training it on an additional 40 billion words.

 

The Future of Machine Learning

As technology advances, machine learning will continue to evolve. Advancements in deep learning, reinforcement learning, and quantum computing hold promise for solving even more complex problems. Though the future is unknown, it will be exciting to see what challenges can be solved that we once thought were nearly impossible.

 

 

Links to third party sites are provided for convenience and unless explicitly stated, AMD is not responsible for the contents of such linked sites and no endorsement is implied. GD-5

© 2023 Advanced Micro Devices, Inc. All rights reserved. AMD, the AMD Arrow logo, AMD Instinct, and combinations thereof are trademarks of Advanced Micro Devices, Inc. Other product names used in this publication are for identification purposes only and may be trademarks of their respective owners.