Technology

LOCI Technology Evolution


OTA
2017
Pioneering NLP for Software Code

Aurora Labs develops NLP techniques for analyzing software code, focusing on automotive and critical safety sectors.

DETECT
2018-2019
Advanced Model Integration

The team incorporates cutting-edge transformers and develops real-time tracing capabilities for System-on-Chips.

LOCI
2020-2022
From NLP to the LCLM

Aurora Labs introduces the Large Code Language Model (LCLM) with a 1000x more efficient vocabulary.

LOCI
2022-Present
Continuous Refinement and Breakthrough

Ongoing improvements to LOCI's predictive analytics, shift-left testing, and code quality insights, reaching 97% model accuracy.


Large Code Language Model - LCLM

LOCI leverages Aurora Labs’ proprietary vertical LLM, the Large Code Language Model (LCLM), which is designed specifically for compiled binaries.

Unlike general-purpose Large Language Models (LLMs), LCLM delivers superior, efficient, and accurate binary analysis and detection of software behavior changes on targeted hardware, offering deep contextual insights into system-wide impacts – without the need for source code.

The LCLM analyzes software artifacts and transforms complex data into meaningful insights. Unlike existing Large Language Models (LLMs), the LCLM’s vocabulary is highly efficient (1000x smaller), with reinvented tokenizers and an effective training pipeline that uses only 6 GPUs.
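One intuition behind a compact vocabulary for compiled code is that disassembled instructions draw on a small, closed set of opcodes and registers, while immediates and addresses can be folded into a single placeholder token. The sketch below is purely illustrative (not Aurora Labs’ tokenizer); the assembly snippet and normalization rule are assumptions:

```python
# Hypothetical illustration: an instruction-level vocabulary for
# disassembled code stays tiny compared with a natural-language subword
# vocabulary, because opcodes and normalized operands repeat heavily.
from collections import Counter

def tokenize(disassembly: str) -> list[str]:
    """Split each disassembled line into opcode and normalized operands.

    Registers are kept as-is; immediates and addresses are folded into
    a single <imm> token so the vocabulary does not grow with every
    literal value seen in the binary.
    """
    tokens = []
    for line in disassembly.strip().splitlines():
        parts = line.replace(",", " ").split()
        for i, tok in enumerate(parts):
            if i > 0 and (tok.startswith("0x") or tok.lstrip("-").isdigit()):
                tok = "<imm>"  # normalize literals to one shared token
            tokens.append(tok)
    return tokens

asm = """
mov r0, 0x20
add r1, r0, 4
mov r0, 0x44
bl  0x8000
"""
vocab = Counter(tokenize(asm))
print(sorted(vocab))  # ['<imm>', 'add', 'bl', 'mov', 'r0', 'r1']
```

Even as the amount of analyzed code grows, a vocabulary built this way converges toward the instruction set’s fixed size rather than expanding like a natural-language vocabulary.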

This LCLM drives LOCI – our Line-Of-Code Intelligence technology platform.
LOCI goes beyond the static analysis of software and understands the context of the software behavior within a given functional flow. These insights enable LOCI to detect deviations that are anomalies to the expected and predicted software behavior before production deployment.



Key Features

System Coverage
Achieve full system coverage using symbol names, eliminating the need for suppliers’ source code. This enhances safety and understanding among SOC Cores islands running QM or ASILx software components while also addressing time and performance impacts.

Time and Performance Degradation Prediction
Predict time and performance degradation across different suppliers’ versions without needing the source code, allowing for root cause analysis. This approach increases system reliability and quality, ensuring functionality uptime and availability for users.
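A minimal sketch of the idea, assuming per-symbol timing measurements are available for two versions (the symbol names, numbers, and threshold below are illustrative, not from LOCI): comparing measurements keyed only by symbol name surfaces degradation between supplier versions without any source code.

```python
# Hedged sketch (not Aurora Labs' implementation): flag symbols whose
# measured runtime grew between a baseline version and a candidate
# version, using only symbol names and timing traces.

def find_regressions(baseline: dict[str, float],
                     candidate: dict[str, float],
                     threshold: float = 0.10) -> list[tuple[str, float]]:
    """Return (symbol, relative growth) pairs exceeding `threshold`."""
    regressions = []
    for symbol, old_cost in baseline.items():
        new_cost = candidate.get(symbol)
        if new_cost is None:
            continue  # symbol removed or renamed in the new version
        growth = (new_cost - old_cost) / old_cost
        if growth > threshold:
            regressions.append((symbol, growth))
    # worst regression first, to guide root cause analysis
    return sorted(regressions, key=lambda r: r[1], reverse=True)

v1 = {"can_rx_isr": 120.0, "parse_frame": 85.0, "update_state": 40.0}
v2 = {"can_rx_isr": 118.0, "parse_frame": 131.0, "update_state": 46.0}
print(find_regressions(v1, v2))
# parse_frame grew ~54% and update_state ~15%; can_rx_isr improved
```

Ranking regressions by relative growth points directly at the symbols to investigate first, which is what enables root cause analysis without the suppliers’ source.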

Test Strategy
Based on behavioral analysis, LOCI indicates which symbols to focus on during unit and system tests. This increases the coverage of system test scenarios and enhances functional quality.

Power Optimization (available Q2 2025)
AI-driven suggestions for reducing energy usage without compromising performance. Track energy consumption across all functions and tasks, indicating where to optimize energy.
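One simple way to attribute energy to individual functions, sketched below under stated assumptions (a measured total energy budget plus per-symbol CPU time from a trace; the names and figures are hypothetical, not LOCI’s method): split the budget by each symbol’s share of CPU time and rank the largest consumers as optimization candidates.

```python
# Hypothetical sketch of per-function energy accounting: attribute a
# measured energy budget to symbols by their share of CPU time, then
# rank the biggest consumers as candidates for optimization.

def energy_by_symbol(cpu_time_us: dict[str, float],
                     total_energy_mj: float) -> list[tuple[str, float]]:
    """Split a measured energy budget across symbols by CPU-time share."""
    total_time = sum(cpu_time_us.values())
    shares = {sym: t / total_time * total_energy_mj
              for sym, t in cpu_time_us.items()}
    return sorted(shares.items(), key=lambda s: s[1], reverse=True)

trace = {"sensor_poll": 500.0, "fft_block": 1500.0, "log_flush": 1000.0}
ranking = energy_by_symbol(trace, total_energy_mj=90.0)
print(ranking[0])  # fft_block holds half the CPU time, so 45.0 mJ
```

Real attribution would also account for differing power draw per instruction mix; the proportional split is only the simplest defensible baseline.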
