A new paper from researchers at CMU and others reveals systemic vulnerabilities in current techniques aimed…
Category: ARTICLE
Articles and other longer-form pieces, such as tutorials and analyses, for anyone wanting to learn more about how AI is progressing.
Scaling TransNormer to 175 Billion Parameters
The field of natural language processing has seen monumental advances with the rise of large language…
Reasoning or Rambling? New Study Questions Logic Behind AI Reasoning
A new paper from Stanford researchers calls into question whether prompting large language models like GPT-3…
Calibration Techniques Improve Probability Estimates from Machine Learning Models
A recent study published in the Proceedings of Machine Learning Research investigated methods for calibrating probabilistic…
New Benchmark Tests the Limits of AI Reasoning Abilities
A new benchmark dataset called the Advanced Reasoning Benchmark (ARB) aims to push artificial intelligence systems…
New Web Agent to Navigate Real Websites
A team of researchers from Google DeepMind and the University of Tokyo has developed a new web…
BTLM-3B-8K: Performance in a 3 Billion Parameter Model
A new language model called BTLM-3B-8K has achieved state-of-the-art accuracy among 3 billion parameter models, rivaling…
New Research Investigates Faithfulness of Reasoning from AI Systems
A new paper from Anthropic researchers explores whether the reasoning that large language models (LLMs) provide…
Can AI-Generated Text Be Reliably Detected? New Research Raises Doubts
A new research paper from the University of Maryland has cast doubt on the reliability of…
Evaluating AI Systems Beyond Human Abilities
As artificial intelligence systems continue to advance, researchers are faced with a new challenge: how…
New DialogStudio Framework Unifies Diverse Conversational AI Datasets
The ability of conversational AI systems…
Re-evaluating Claims of Speedups from Efficient Training Algorithms
Training massive neural networks requires extraordinary amounts of computation, often costing millions of dollars and emitting…
GPT-4’s Alleged Decline Over Time: A Misinterpretation
A new paper analyzing different versions of GPT-3.5…
Economists Exposed: New Study Cracks Anonymity of Controversial Job Forum
A new research paper has uncovered flaws in the anonymization system used on the popular Economics…
Monitoring ChatGPT Drifts Reveals Substantial Behavior Changes Over Time
A new paper by researchers at Stanford University and UC Berkeley reveals that the behavior of…
Llama 2: An Open Large Language Model Matching Proprietary Chatbots
A new large language model called Llama 2 was recently open-sourced by researchers at Meta AI…
Faster Transformers for Longer Context with FlashAttention-2
Researchers from Stanford University have developed a new technique called FlashAttention-2 that can significantly speed up…
Retentive Networks: The Next Evolution of Transformers for AI?
A new paper from researchers at Microsoft proposes a novel neural network architecture called Retentive Networks…
Faster Optimization with Counterintuitively Long Steps
A new study by Benjamin Grimmer at Johns Hopkins University has demonstrated that the classic gradient…
New Framework Generates Commonsense Knowledge with Smaller AI Models
Researchers at the Allen Institute for AI have developed a novel framework called I2D2 that can…