AI Companies are Nothing Without PyTorch

Since Facebook (now Meta) open-sourced PyTorch in 2017, there’s been no looking back for the AI ecosystem. Today, most tech companies, including heavyweights like OpenAI, Tesla, and Microsoft, use PyTorch because of the dynamic nature of its computational graphs, which allows for rapid prototyping and experimentation.

Researchers can easily test new ideas and iterate on their models without the need for extensive reconfiguration. This flexibility is a significant advantage in the fast-paced world of AI research, where innovation and quick adaptation are key.
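
The easiest way to see why this matters is PyTorch’s define-by-run execution: the computational graph is built as the code runs, so ordinary Python control flow can change the model from one call to the next, and autograd differentiates through whatever actually executed. Below is a minimal, hypothetical sketch of that idea; the layer sizes and class name are illustrative, not taken from any company’s code.

```python
import torch
import torch.nn as nn

class DynamicDepthNet(nn.Module):
    """Toy network whose depth is chosen per call at run time."""
    def __init__(self, hidden=32):
        super().__init__()
        self.inp = nn.Linear(8, hidden)
        self.block = nn.Linear(hidden, hidden)
        self.out = nn.Linear(hidden, 1)

    def forward(self, x, n_steps):
        h = torch.relu(self.inp(x))
        for _ in range(n_steps):          # plain Python loop controls the depth
            h = torch.relu(self.block(h))
        return self.out(h)

model = DynamicDepthNet()
x = torch.randn(4, 8)
loss = model(x, n_steps=3).sum()          # try n_steps=1 or 10 on the next call
loss.backward()                           # autograd traces whatever actually ran
```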

The Rise of PyTorch

PyTorch usage has steadily increased within the research community. For example, nearly 70% of recent machine learning papers use PyTorch, compared to just 4% using TensorFlow.

Number of repositories for PyTorch
(Source: GitHub)

Furthermore, in 2022 alone, 45,000 PyTorch-exclusive models were added to Hugging Face, whereas only 4,000 new TensorFlow exclusives were added. In other words, roughly 92% of the framework-exclusive models added to Hugging Face that year were PyTorch-only.

Notably, all 30 of the most popular models on Hugging Face are available in PyTorch, and none are exclusive to TensorFlow, which points to PyTorch’s popularity and adoption outpacing every other framework.

All of the top 30 AI models on Hugging Face are available in PyTorch
(Source: GitHub)

Needless to say, PyTorch has become the top choice in the research community thanks to its simplicity, dynamic computational graphs, and Pythonic design.

PyTorch Fuels Tesla

Tesla employs a HydraNet architecture, which is a giant neural network that handles multiple tasks simultaneously. This architecture is trained using PyTorch, leveraging its dynamic computation graphs and distributed training capabilities to manage the complexity of tasks like lane detection, pedestrian tracking, and traffic signal recognition.
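
Tesla’s actual HydraNet is proprietary, but the pattern described above, one shared backbone feeding several task-specific heads, is straightforward to express in PyTorch. The sketch below is purely illustrative: the backbone, task names, and output sizes are assumptions, not Tesla’s code.

```python
import torch
import torch.nn as nn

class HydraNetSketch(nn.Module):
    """Hypothetical multi-task network: one shared backbone, several task heads."""
    def __init__(self):
        super().__init__()
        # Shared feature extractor (a stand-in for a large vision backbone)
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # One lightweight head per task
        self.heads = nn.ModuleDict({
            "lane_detection": nn.Linear(64, 4),
            "pedestrian": nn.Linear(64, 2),
            "traffic_signal": nn.Linear(64, 3),
        })

    def forward(self, images):
        features = self.backbone(images)          # computed once, shared by all heads
        return {task: head(features) for task, head in self.heads.items()}

outputs = HydraNetSketch()(torch.randn(2, 3, 128, 128))
print({task: out.shape for task, out in outputs.items()})
```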

Tesla’s Dojo supercomputer uses PyTorch for training specific parts of its neural network architecture, optimising the training process and improving the performance of its self-driving models.

Tesla collects data from its fleet of vehicles and uses PyTorch to train models on this real-world data. This continuous loop of data collection, labelling, and training helps Tesla improve its AI systems in real time.
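
What one step of such a retraining loop might look like in PyTorch is sketched below, using the standard Dataset and DataLoader utilities to iterate over a freshly labelled batch. The data, labels, and model are stand-ins; this is not Tesla’s pipeline.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Stand-ins for newly collected camera crops and their labels
images = torch.randn(256, 3, 64, 64)
labels = torch.randint(0, 5, (256,))
loader = DataLoader(TensorDataset(images, labels), batch_size=32, shuffle=True)

# Toy classifier standing in for a production perception model
model = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
                      nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, 5))
optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

for x, y in loader:                  # one pass over the newly labelled data
    loss = loss_fn(model(x), y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```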

Tesla’s reliance on PyTorch runs so deep that even Elon Musk, who has openly said he hates Facebook, still uses technology developed by Facebook in the form of PyTorch.

Tesla AI engineers and scientists be like: "can we still use PyTorch though?"

— Yann LeCun (@ylecun), May 15, 2020

Meta’s AI is All About PyTorch

Meta has migrated all its AI systems to PyTorch, which is used for applications such as content recommendation, image recognition, and natural language processing (NLP).

By standardising on PyTorch, Meta has unified its research and production pipelines, allowing for seamless transitions from experimentation to deployment. This unification has led to more efficient and scalable AI model development.

“We didn’t have a grand vision for what PyTorch could be in the world. We built it for ourselves and it ended up being something that was useful for other people. At the same time, the addressable market has expanded a lot, and that made it a bigger deal than we ever thought,” said Soumith Chintala, PyTorch co-creator and researcher at Meta, describing how the framework took off unexpectedly.

Meta operates over 1,700 PyTorch-based inference models in full production, performing trillions of inference operations daily. This extensive use underscores PyTorch’s reliability and scalability for large-scale AI applications.
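
A common way to move a PyTorch model from research code to high-volume serving is to export it with TorchScript and run it with gradient tracking disabled. The sketch below shows that generic pattern only; the model, file name, and shapes are hypothetical, and this is not a description of Meta’s production stack.

```python
import torch
import torch.nn as nn

# Toy scorer standing in for a production inference model
model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 1))
model.eval()

# TorchScript captures the model so it can be loaded and served independently
scripted = torch.jit.script(model)
scripted.save("scorer.pt")

# At serving time: load once, then answer many requests without autograd overhead
served = torch.jit.load("scorer.pt")
with torch.inference_mode():
    scores = served(torch.randn(1024, 128))
print(scores.shape)  # torch.Size([1024, 1])
```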

Airbnb and PyTorch

Airbnb uses PyTorch to power its customer service dialogue assistant, which enhances customer experience through smart replies.

Airbnb treats the smart replies recommendation model as a machine translation problem, using PyTorch’s sequence-to-sequence models to translate customer input messages into agent responses. This approach leverages PyTorch’s advanced attention mechanisms and beam search capabilities.
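
In PyTorch terms, that framing maps onto an encoder-decoder model: the customer message is encoded, and the agent reply is decoded token by token, greedily or with beam search at inference time. The sketch below uses PyTorch’s built-in nn.Transformer as a stand-in; the class name, vocabulary size, and dimensions are illustrative assumptions, not Airbnb’s model.

```python
import torch
import torch.nn as nn

class SmartReplySeq2Seq(nn.Module):
    """Minimal encoder-decoder sketch: customer message in, agent reply out."""
    def __init__(self, vocab_size=10_000, d_model=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=4,
            num_encoder_layers=2, num_decoder_layers=2,
            batch_first=True,
        )
        self.generator = nn.Linear(d_model, vocab_size)

    def forward(self, src_tokens, tgt_tokens):
        src = self.embed(src_tokens)              # customer message
        tgt = self.embed(tgt_tokens)              # reply generated so far
        # Causal mask so the decoder only attends to earlier reply tokens
        tgt_mask = self.transformer.generate_square_subsequent_mask(tgt_tokens.size(1))
        out = self.transformer(src, tgt, tgt_mask=tgt_mask)
        return self.generator(out)                # per-token logits over the reply vocabulary

model = SmartReplySeq2Seq()
logits = model(torch.randint(0, 10_000, (8, 20)),   # customer messages
               torch.randint(0, 10_000, (8, 15)))   # partial agent replies
print(logits.shape)  # torch.Size([8, 15, 10000])
```

At inference time, the reply is generated one token at a time from these logits, with beam search keeping only the few most promising partial replies at each step.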

Furthermore, by integrating PyTorch into their dialogue assistant, Airbnb has improved the accuracy and relevance of automated responses, leading to a better customer service experience.

Blue River Technology Farms AI Through PyTorch

PyTorch’s flexibility allows Blue River Technology to quickly adapt its models to the constantly changing conditions in agricultural fields. This adaptability is crucial for addressing new challenges that arise daily.

“We chose PyTorch because it’s very flexible and easy to debug. New team members can quickly get up to speed, and the documentation is thorough. The framework gives us the ability to support production model workflows and research workflows simultaneously,” said Chris Padwick, the director of computer vision machine learning at Blue River.

Blue River Technology uses Weights & Biases for experiment tracking and model visualisation, which integrates seamlessly with PyTorch. This integration provides real-time insights into model performance and helps in debugging and improving models.
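
In practice, that integration is a few lines wrapped around an ordinary PyTorch training loop: wandb.init starts a run, wandb.watch streams gradients, and wandb.log records metrics as training progresses. The sketch below shows the generic pattern; the project name, model, and data are hypothetical, not Blue River’s setup.

```python
import torch
import torch.nn as nn
import wandb  # pip install wandb

# Hypothetical project and config, purely for illustration
wandb.init(project="crop-vision-sketch", config={"lr": 1e-3, "epochs": 3})

model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 2))
optimizer = torch.optim.Adam(model.parameters(), lr=wandb.config.lr)
loss_fn = nn.CrossEntropyLoss()
wandb.watch(model, log="gradients")   # stream gradient/parameter histograms

for epoch in range(wandb.config.epochs):
    x, y = torch.randn(64, 16), torch.randint(0, 2, (64,))   # stand-in batch
    loss = loss_fn(model(x), y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    wandb.log({"epoch": epoch, "train_loss": loss.item()})   # real-time metrics

wandb.finish()
```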

PyTorch enables rapid experimentation and productionisation of models, ensuring that Blue River Technology’s field machines operate with high accuracy and speed.

PyTorch Lights up Genentech’s Research

PyTorch’s dynamic computation graphs and ease of use facilitate the development of complex models for predicting molecular properties, accelerating the drug discovery process.

“At Genentech, PyTorch is being used to develop personalised cancer medicines as well as for drug discovery and in cancer therapy,” said Daniel Bozinov, head of AI – early clinical development informatics, Genentech.

Genentech uses PyTorch to develop models that can identify cancer-specific molecules, enabling personalised cancer treatments. This application leverages PyTorch’s flexibility and control structures for precise model tuning.
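
One simple and widely used formulation of this kind of problem is a feed-forward network over precomputed molecular fingerprints that predicts whether a molecule shows a property of interest. The sketch below illustrates that generic approach in PyTorch; it is not Genentech’s model, and the fingerprint size, architecture, and data are assumptions.

```python
import torch
import torch.nn as nn

class MoleculePropertyNet(nn.Module):
    """Toy classifier over precomputed molecular fingerprints (e.g. 2048-bit ECFP)."""
    def __init__(self, fp_bits=2048):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(fp_bits, 512), nn.ReLU(), nn.Dropout(0.2),
            nn.Linear(512, 128), nn.ReLU(),
            nn.Linear(128, 1),   # logit: does the molecule show the target activity?
        )

    def forward(self, fingerprints):
        return self.net(fingerprints).squeeze(-1)

model = MoleculePropertyNet()
fingerprints = torch.randint(0, 2, (32, 2048)).float()   # stand-in fingerprint batch
labels = torch.randint(0, 2, (32,)).float()              # stand-in activity labels
loss = nn.BCEWithLogitsLoss()(model(fingerprints), labels)
loss.backward()
```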

PyTorch’s ability to transition seamlessly from research to production allows Genentech to quickly deploy new models and integrate them into their therapeutic workflows.

Microsoft and PyTorch

“We take things to another level with Azure machine learning because we can basically create infrastructure to train our models as we go. We’ve found PyTorch and Azure machine learning to be very powerful together,” said Jeremy Jancsary, senior research scientist at Microsoft.

Microsoft uses PyTorch for its language modelling service, which includes state-of-the-art language models for various applications.

Microsoft has also built an internal language modelling toolkit on top of PyTorch, which improved the onboarding of new users and facilitated the development of advanced/custom tasks and architectures.

Furthermore, PyTorch’s native extensibility allowed Microsoft to scale its language modelling features to billions of words, enhancing the performance and accuracy of its language models.
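
Scaling training across many workers in PyTorch is usually done with DistributedDataParallel: each process trains on a shard of the corpus, and gradients are averaged automatically during the backward pass. The sketch below shows that generic pattern, launched with torchrun; it is not Microsoft’s internal toolkit, and the model, vocabulary, and hyperparameters are stand-ins.

```python
# Launch with, for example: torchrun --nproc_per_node=4 train_lm_sketch.py
import torch
import torch.distributed as dist
import torch.nn as nn
from torch.nn.parallel import DistributedDataParallel as DDP

class TinyLM(nn.Module):
    """Stand-in next-token language model: embedding -> LSTM -> logits."""
    def __init__(self, vocab=50_000, dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab, dim)
        self.lstm = nn.LSTM(dim, dim, batch_first=True)
        self.head = nn.Linear(dim, vocab)

    def forward(self, tokens):
        hidden, _ = self.lstm(self.embed(tokens))
        return self.head(hidden)

def main():
    dist.init_process_group(backend="gloo")    # use "nccl" on GPU clusters
    model = DDP(TinyLM())
    optimizer = torch.optim.AdamW(model.parameters(), lr=3e-4)
    loss_fn = nn.CrossEntropyLoss()

    tokens = torch.randint(0, 50_000, (8, 128))    # stand-in text batch for this worker
    logits = model(tokens[:, :-1])                 # predict the next token
    loss = loss_fn(logits.reshape(-1, 50_000), tokens[:, 1:].reshape(-1))
    optimizer.zero_grad()
    loss.backward()                                # gradients all-reduced across workers
    optimizer.step()
    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```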

Why Not TensorFlow?

While TensorFlow remains a powerful and mature framework with strong production capabilities, it struggles to compete with PyTorch in several key areas. PyTorch’s ease of use, flexibility, and dynamic computation graphs make it the preferred choice for researchers and developers who prioritise rapid prototyping and experimentation.

Its intuitive design and deep integration with Python have fostered a vibrant community that continues to drive innovation in AI. On the other hand, TensorFlow’s strengths lie in its scalability and robust performance, making it ideal for large-scale production environments.

Its extensive ecosystem and support for distributed training ensure that it remains a valuable tool for deploying high-performance AI models.


