Yearly Archives: 2021

New DeepMind Research Studies Language Modeling At Scale

Language is an essential part of being human because of its role in demonstrating and promoting comprehension - or intellect. It allows people to express ideas,...

IBM Research Introduces ‘CodeNet’: A Large-Scale Dataset Aimed at Teaching Machine Learning Models to Code

If you have checked your bank account, used a credit card, gone to the doctor, booked a ticket, paid your taxes, or bought anything...

Google AI’s New Study Focuses on Understanding What Linguistic Information is Captured by Language Models

Pre-trained language models, such as BERT and GPT-3, have become increasingly popular in Natural Language Processing (NLP) in recent years. Language...

Researchers Propose a Family of Next-Generation Transformer Models That Use Sparse Layers to Scale Efficiently and Perform Unbatched Decoding Much Faster than...

Large-scale transformer models have dramatically advanced natural language processing (NLP) tasks. The original Transformer significantly improved the state of the art in machine translation. However, the advantages...

AMD Introduces Its MI200 Series GPUs: Exascale-Class GPU Accelerators For Deep Learning

Scientific endeavors such as weather forecasting, climate modeling, analyzing new energy sources, drug discovery, training AIs, or running any form of large-scale simulation require...

MIT Researchers Propose ‘GeoMol’: A Deep Learning Model That Predicts The 3D Shapes Of Drug-Like Molecules

To search for new therapies, scientists look for drug-like compounds that can bind to disease-causing proteins and modify their function. They must grasp a...

Meta AI Develops A Conversational Parser For On-Device Voice Assistants

A variety of devices, such as computers, smart speakers, and cellphones, use conversational assistants to help users with tasks ranging from calendar management to...

AWS Launches ‘SageMaker Studio Lab’: A Free Tool To Learn and Experiment With Machine Learning

AWS introduced SageMaker Studio Lab, a free offering to help developers master machine learning techniques and experiment with the technology, at its re:Invent...

Researchers from Sea AI Lab and National University of Singapore Introduce ‘PoolFormer’: A Model Derived from MetaFormer for Computer Vision Tasks

Transformers have undoubtedly been the biggest hype in the world of Deep Learning over the last few years. Since their advent in 2017 with the...

Google AI Introduces MURAL (Multimodal, Multi-task Retrieval Across Languages) For Image–Text Matching

For many concepts, there is no direct one-to-one translation from one language to another. Even when there is, such translations typically carry various connections...

LinkedIn Formulates Entity Inference As An Inference Problem Using Graph Neural Networks

LinkedIn allows its users to add information about themselves to their profiles, such as their career history, education, and abilities. They...

Purdue University Researchers Introduce A Compositional Reader Model That Composes Multiple Documents In One Shot To Form A Unified Political Entity Representation

Natural language processing (NLP) is an area of computer science—more specifically, a branch of artificial intelligence (AI)—that deals with computers' capacity to understand the...
