Xelera Silva provides best-in-class throughput and latency for XGBoost, LightGBM, and Random Forest inference by leveraging commercial off-the-shelf (COTS) datacenter-grade FPGA accelerators.
10x higher throughput than GPUs, 50x higher throughput than CPUs.
Microsecond query latency without jitter.
Works with standard machine learning frameworks with zero code changes.
Built to run out of the box on COTS servers and datacenter FPGA hardware.
Gradient Boosting frameworks such as XGBoost, LightGBM, and CatBoost, as well as Random Forest algorithms, are widely used in ransomware and DDoS detection systems, recommender systems, and trading systems.
The network is the first line of defense in securing an enterprise IT system. Firewalls using Gradient Boosting Decision Trees perform application-level inspection to detect suspicious patterns in network flow data (e.g., in the case of a DDoS attack). Silva provides the inference speed needed to classify network flows at full line rate, so this computationally intensive step does not slow down the network.
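To illustrate why per-flow classification is computationally intensive, here is a minimal pure-Python sketch of gradient-boosted tree inference. The feature names, thresholds, and tree shapes are entirely hypothetical, not Silva's or any real model's: every query must walk each tree root to leaf and sum the leaf scores, a branchy, sequential workload that must repeat for every flow on the wire.

```python
# Illustrative sketch only: scoring one network flow with a toy gradient-boosted
# tree ensemble in pure Python. Features and thresholds are invented examples.

# An internal node is (feature_index, threshold, left, right); a leaf is a float.

def score_tree(node, features):
    """Walk one decision tree down to a leaf for the given feature vector."""
    while isinstance(node, tuple):
        feat, thresh, left, right = node
        node = left if features[feat] < thresh else right
    return node

def score_ensemble(trees, features):
    """Sum the leaf values of all trees (additive boosting model)."""
    return sum(score_tree(t, features) for t in trees)

# Hypothetical flow features: [packets per second, mean packet size, SYN ratio]
flow = [50_000.0, 64.0, 0.97]

# Two toy trees that flag a high packet rate of small, SYN-heavy packets
trees = [
    (0, 10_000.0, -1.0, (1, 128.0, (2, 0.9, 0.2, 1.5), -0.5)),
    (2, 0.9, -0.8, 0.9),
]

raw_score = score_ensemble(trees, flow)
suspicious = raw_score > 0.0  # threshold the additive score
```

Real models contain hundreds of trees, so the traversal cost above is multiplied accordingly for every single flow, which is the workload an FPGA pipeline can parallelize.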
Algorithmic trading uses decision algorithms to automate trading instructions, with the automated decisions often made by XGBoost and LightGBM machine learning models. Low execution latency is a critical requirement in these systems. Silva provides an API for trading software frameworks that performs inference of XGBoost and LightGBM models with a latency of a few microseconds, enabling clients to execute trading instructions faster than their competitors.
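When latency is the goal, what matters is the per-query distribution, not just the mean. Below is a hedged sketch of how a trading team might benchmark single-query latency; `predict` is a stand-in stub, not Silva's actual API, and the harness would be pointed at the real inference call.

```python
# Illustrative latency-measurement harness. predict() is a placeholder stub
# standing in for a real single-query model inference call.
import time
import statistics

def predict(features):
    # Placeholder for a model inference call (e.g., a GBDT predict).
    return sum(features) * 0.001

def measure_latency_us(fn, features, n=1000):
    """Return per-call latencies in microseconds over n single-query calls."""
    samples = []
    for _ in range(n):
        t0 = time.perf_counter()
        fn(features)
        samples.append((time.perf_counter() - t0) * 1e6)
    return samples

latencies = measure_latency_us(predict, [0.5, 1.2, 3.4])
p50 = statistics.median(latencies)
p99 = statistics.quantiles(latencies, n=100)[98]  # tail latency reveals jitter
```

Comparing p50 against p99 is what exposes jitter: a software stack may show a low median but a long tail, whereas a fixed-latency hardware pipeline keeps the two close together.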