Large Language Model

A New AI Research From MIT Reduces Variance in Denoising Score-Matching, Improving Image Quality, Stability, and Training Speed in Diffusion Models

Diffusion models have recently produced outstanding results on various generative tasks, including the creation of images, 3D point clouds, and molecular conformers. Ito stochastic...

Researchers at IBM Propose MoLFormer-XL: A Pretrained Artificial Intelligence (AI) Model That Infers The Structure of Molecules From Simple Representations

Recent technological developments have led to the widespread adoption of large pretrained models for performing several tasks. These models, which could previously summarise texts...

Welcome to the New Saga Introduced by MusicLM: This AI Model can Generate Music from Text Descriptions

There has been an explosion of generative AI models in the last couple of months. We’ve seen models that could generate realistic images from...

Meet AudioLDM: A Latent Diffusion Model For Audio Generation That Trains On AudioCaps With A Single GPU And Achieves SOTA Text-To-Audio (TTA) Performance

For many applications, like augmented and virtual reality, game creation, and video editing, it is crucial to produce sound effects, music, or speech by...

Researchers at the University of Warsaw Developed a Deep Learning (DL) Dialogue System Called Serena for Mental Health Therapy

The limited accessibility of mental health therapy remains one of the world's largest problems. According to estimates, 658 million individuals...

Microsoft Research Proposes BioGPT: A Domain-Specific Generative Transformer Language Model Pre-Trained on Large-Scale Biomedical Literature

With recent technological breakthroughs, researchers have started employing several machine learning techniques on the abundance of biomedical data that is available. Using techniques like...

A New AI-Based Method Called SparseGPT can Prune Generative Pretrained Transformer (GPT) Family Models in One Shot to at least 50% Sparsity

The Generative Pretrained Transformer (GPT) family of large language models (LLMs) has demonstrated amazing performance across many tasks. However, they are cumbersome...

This Artificial Intelligence (AI) Framework Called MPCFormer Enables Private Inference With Secure Multiparty Computation (MPC) For Transformers (Copilot, ChatGPT, OPT)

Pretrained Transformer models can execute various downstream tasks with excellent performance and are often deployed as model inference services. Such model inference services, however,...

Microsoft AI Research Proposes eXtensible Prompt (X-Prompt) for Prompting a Large Language Model (LLM) Beyond Natural Language (NL)

Due to their capacity to produce text comparable to human-written material and their versatility in various natural language processing (NLP) applications, large language models...

Meet SymbolicAI: The Powerful Framework That Combines The Strengths Of Symbolic Artificial Intelligence (AI) And Large Language Models

The latest innovations in the field of Artificial Intelligence have made it possible to describe intelligent systems with a better and more eloquent understanding of...

Researchers at the University of Maryland Propose Cold Diffusion: A Diffusion Model with Deterministic Perturbations

Diffusion models can be interpreted as stochastic encoder/decoder architectures, which are built around a residual architecture that successively applies a learned transformation. To this,...

Meet HyDE: An Effective Fully Zero-Shot Dense Retrieval System That Requires No Relevance Supervision, Works Out-of-the-Box, And Generalizes Across Tasks

Dense retrieval, a technique for finding documents based on similarities in semantic embedding, has been shown effective for tasks including fact-checking, question-answering, and online...
