
AI Research/Dev Super Interesting News: Open-Source Models from LG AI Research: EXAONEPath and EXAONE 3.0 and more....

Trending Open-Source Models from LG AI Research: EXAONEPath and EXAONE 3.0

Newsletter Series by Marktechpost.com

Hi There…

It was another busy week with plenty of news and updates about artificial intelligence (AI) research and dev. We have curated the top industry research updates specially for you. I hope you enjoy these updates, and make sure to share your opinions with us on social media.

EXAONEPath is designed as a patch-level foundational model that operates on whole-slide images (WSIs), which are high-resolution images of tissue slides used in histopathology. Often containing billions of pixels, these images are crucial for cancer subtyping, prognosis prediction, and tissue microenvironment analysis. However, traditional models trained on these images often suffer from a phenomenon known as WSI-specific feature collapse, where the features extracted by the model tend to cluster based on the individual WSI rather than the pathological characteristics of the tissue. This clustering can significantly limit the model’s ability to generalize across different WSIs and, consequently, its effectiveness in real-world applications.

At the core of EXAONEPath’s innovation is its approach to overcoming WSI-specific feature collapse. The model employs self-supervised learning and stain normalization techniques, specifically Macenko normalization, to standardize the color characteristics of WSIs before feature extraction. This process reduces the variability introduced by different staining protocols across laboratories, which is a primary cause of feature collapse. By applying this normalization, EXAONEPath ensures that the features it learns are more focused on the pathologically significant aspects of the tissue, such as nuclear size and shape, cell density, and structural changes, rather than superficial color variations.
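To make the Macenko step concrete, here is a minimal sketch of the general Macenko stain-normalization procedure for a single RGB tile, using NumPy. This is an illustration of the standard technique, not LG AI Research’s implementation; the H&E reference stain matrix and concentration values below are commonly used illustrative defaults, and parameter names are our own.

```python
import numpy as np

def macenko_normalize(img, Io=240, alpha=1, beta=0.15):
    """Normalize the staining of an RGB tile (H, W, 3), dtype uint8.

    Sketch of Macenko normalization: estimate the tile's two stain
    vectors from optical density, then re-render the tile under a
    fixed reference stain matrix. Reference values are illustrative.
    """
    # Illustrative H&E reference: columns are hematoxylin and eosin.
    ref_stains = np.array([[0.5626, 0.2159],
                           [0.7201, 0.8012],
                           [0.4062, 0.5581]])
    ref_max_conc = np.array([1.9705, 1.0308])

    h, w, _ = img.shape
    # 1) Convert RGB intensities to optical density (Beer-Lambert).
    od = -np.log((img.reshape(-1, 3).astype(float) + 1) / Io)
    # 2) Discard near-transparent pixels with low optical density.
    od_hat = od[np.all(od > beta, axis=1)]
    # 3) Project onto the plane of the two leading eigenvectors
    #    (np.linalg.eigh returns eigenvalues in ascending order).
    _, eigvecs = np.linalg.eigh(np.cov(od_hat.T))
    plane = eigvecs[:, 1:3]
    proj = od_hat @ plane
    # 4) Robust angular extremes in that plane give the stain directions.
    phi = np.arctan2(proj[:, 1], proj[:, 0])
    lo_phi = np.percentile(phi, alpha)
    hi_phi = np.percentile(phi, 100 - alpha)
    v1 = plane @ [np.cos(lo_phi), np.sin(lo_phi)]
    v2 = plane @ [np.cos(hi_phi), np.sin(hi_phi)]
    stains = np.array([v1, v2]).T if v1[0] > v2[0] else np.array([v2, v1]).T
    # 5) Solve for per-pixel stain concentrations, rescale to the reference.
    conc = np.linalg.lstsq(stains, od.T, rcond=None)[0]
    max_conc = np.percentile(conc, 99, axis=1)
    conc *= (ref_max_conc / max_conc)[:, None]
    # 6) Reconstruct the tile under the reference stain matrix.
    norm = Io * np.exp(-ref_stains @ conc)
    return np.clip(norm.T, 0, 255).reshape(h, w, 3).astype(np.uint8)
```

After this step, tiles from different laboratories share a common color profile, so downstream self-supervised features are driven by tissue morphology rather than staining variation.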

EXAONE 3.0 represents a significant milestone in the evolution of language models developed by LG AI Research, particularly within the Expert AI domain. The name “EXAONE” derives from “EXpert AI for EveryONE,” encapsulating LG AI Research’s commitment to democratizing access to expert-level artificial intelligence capabilities. This vision aligns with a broader objective of enabling both the general public and experts to achieve new heights of proficiency in various fields through advanced AI. The release of EXAONE 3.0 was a landmark event, marked by the introduction of models with enhanced performance metrics. Among these, the 7.8-billion-parameter EXAONE-3.0-7.8B-Instruct model, instruction-tuned for superior performance, was made publicly available. This decision to open-source one of its most advanced models underscores LG’s dedication to fostering innovation and collaboration within the global AI community.

The journey from EXAONE 1.0 to EXAONE 3.0 marks a significant step in LG AI Research’s development of large language models, reflecting substantial technical advancements and efficiency improvements. EXAONE 1.0, launched in 2021, laid the groundwork for LG’s ambitious AI goals, but it was in EXAONE 2.0 that critical enhancements were introduced, including improved performance metrics and cost efficiencies. The most notable leap occurred with the release of EXAONE 3.0, where a three-year focus on AI model compression technologies resulted in a dramatic 56% reduction in inference processing time and a 72% reduction in cost compared to EXAONE 2.0. This culminated in a model operating at just 6% of the initially released EXAONE 1.0 cost. These improvements have increased the model’s applicability in real-world scenarios and made advanced AI more accessible and economically feasible for broader deployment across various industries.
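The quoted cost figures can be cross-checked with simple arithmetic. Taking EXAONE 1.0 as the baseline, a 72% reduction versus EXAONE 2.0 combined with the stated 6%-of-1.0 figure implies a relative cost for EXAONE 2.0; this implied figure is our inference, not a number stated by LG AI Research.

```python
# Relative inference costs, with EXAONE 1.0 normalized to 1.0.
cost_v3_vs_v2 = 1.0 - 0.72   # 72% cheaper than EXAONE 2.0 -> 28% of its cost
cost_v3_vs_v1 = 0.06         # stated: 6% of EXAONE 1.0's cost

# Implied cost of EXAONE 2.0 relative to EXAONE 1.0:
# cost(3.0)/cost(1.0) = [cost(3.0)/cost(2.0)] * [cost(2.0)/cost(1.0)]
cost_v2_vs_v1 = cost_v3_vs_v1 / cost_v3_vs_v2
print(f"EXAONE 2.0 implied cost vs 1.0: {cost_v2_vs_v1:.1%}")  # ~21.4%
```

In other words, the two stated figures are mutually consistent if EXAONE 2.0 already ran at roughly a fifth of EXAONE 1.0’s cost.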

Trending Feeds…

➡️ We’re excited to introduce @ChaiDiscovery and release Chai-1, a foundation model for molecular structure prediction that performs at the state-of-the-art across a variety of drug discovery tasks [Tweet]

➡️ This AI Paper from Apple Introduces AdEMAMix: A Novel Optimization Approach Leveraging Dual Exponential Moving Averages to Enhance Gradient Efficiency and Improve Large-Scale Model Training Performance [Tweet]

➡️ Scale AI Proposes PlanSearch: A New SOTA Test-Time Compute Method to Enhance Diversity and Efficiency in Large Language Model Code Generation [Tweet]

➡️ The whole Reflection-70B debacle points to the desperate need for a better AI evaluation ecosystem. [Tweet]

➡️ Introducing RobustSAM by @Snap Research! [Tweet]

Want to get in front of 1 Million+ data scientists, developers, AI engineers, and CTOs?

Sponsor a newsletter or social post