Intel Labs to Present Industry-Leading AI Research at NeurIPS 2023
Intel Labs will highlight 31 research projects that are shaping the future of AI innovation.
[Embedded video]
Intel Labs is showcasing some of its most significant and industry-first AI innovations at NeurIPS 2023, the leading global event for developers, researchers, and academic professionals working in AI and machine learning. The event takes place Dec. 10-16 in New Orleans.
At NeurIPS 2023, Intel Labs will present industry-leading AI research and share the company’s “AI Everywhere” vision with a diverse community of innovators and thought leaders. During the conference, Intel Labs will present 31 papers, including 12 main conference papers and 19 workshop papers, and will host demos in booth #405. The research focuses on novel models, methods and tools for AI in science, graph learning, multimodal generative AI, and AI algorithms and optimization, with applications spanning climate modeling, drug discovery and materials science.
In addition, Intel Labs will host the “AI for Accelerated Materials Discovery (AI4Mat) Workshop” on Dec. 15, which will provide a platform for AI researchers and materials scientists to tackle challenges in AI-driven materials discovery and development.
The research Intel Labs will present at NeurIPS 2023 falls into the following categories:
AI for Science
- Brain encoding models: Models based on multimodal transformers, co-developed with researchers at the University of Texas at Austin, that can predict brain responses, particularly in cortical regions that represent conceptual meaning, and that provide insights into the brain’s capacity for multimodal processing.
- ClimateSet: A large-scale climate model dataset for machine learning, developed with the Quebec Artificial Intelligence Institute (Mila), that can quickly project new climate change scenarios and gives the machine learning (ML) community a basis for building disruptive climate-centric applications.
- HoneyBee: A state-of-the-art large language model (LLM), co-developed with Mila, that helps researchers understand materials science more quickly.
Multimodal Generative AI
- COCO-Counterfactuals: A multimodal technique for generating synthetic counterfactual data that mitigates unwanted statistical biases in pre-trained multimodal models, helping improve AI model performance on many downstream tasks such as image-text retrieval and image recognition.
- LDM3D-VR: A latent diffusion model for 3D virtual reality (VR) that simplifies 3D video generation for AI applications.
- CorresNeRF: An image rendering method that uses neural radiance fields to reconstruct a 3D representation of a scene from 2D images.
Improving AI Performance
- DiffPack: A generative AI method for protein modeling that helps ensure generated 3D structures reflect the real-world structural properties of proteins.
- InstaTune: A method that generates a super-network during the fine-tuning stage to reduce the overall time and compute resources required for neural architecture search (NAS).
Graph Learning
- A*Net: The industry’s first path-based method for knowledge graph reasoning on million-scale datasets, enabling scaling to datasets previously beyond computational reach and improving the accuracy of LLMs.
- ULTRA: The industry’s first foundation model for knowledge graph reasoning and a new approach to learning universal and transferable representations of graphs and their relations.
- Perfograph: A novel compiler-based graph program representation that can capture numerical information and composite data structures, improving the ability of ML methods to reason about programming languages.
To learn more about the complete collection of research highlighted at the event, read the in-depth event post and watch Intel Labs’ video at the top of this page, or visit Intel Labs at NeurIPS (Booth #405).
Additional information on the event can also be found on the NeurIPS 2023 website.
Editor’s Note: The brain encoding models section of this news post was updated after publication on Dec. 6, 2023.