SOPHON Micro Server SE3 is a micro server designed for deep learning. Powered by SOPHGO's self-developed TPU chip, the BM1682, it delivers up to 3 TFLOPS of FP32 peak computing power. It can process 4 channels of HD video simultaneously and, combined with a range of algorithms, supports face comparison, control analysis, video structuring, object recognition, and more.
It can connect to different types of capture devices and process data close to where it is generated, enabling real-time response and edge-side control.
It can host intelligent algorithms and applications from customers and partners, enabling rapid AI empowerment across industry application scenarios.
Its "edge-cloud" collaborative distributed architecture splits the computing load between the edge and the cloud, enabling scalable, ultra-large-scale management while retaining the real-time response and control characteristics of the edge, as sketched below.
3 TFLOPS of peak computing power
Supports a face database of up to 50,000 entries, with each recognition completed in 0.5 seconds (see the similarity-search sketch after this list)
Supports recognition on 4 video streams or 10 image streams
Supports algorithms such as person/vehicle/non-motor-vehicle/object recognition, video structuring, and trajectory analysis
Supports flexible deployment in scenarios such as smart parks, security, industrial control, and business applications
Supports mainstream deep learning frameworks including Caffe, TensorFlow, PyTorch, MXNet, and Paddle Lite
Integrates multiple intelligent algorithms to support edge computing tasks such as face recognition, vehicle recognition, object recognition, video structuring, and behavior analysis.
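The face-comparison figures above boil down to nearest-neighbour search over face embeddings. The following is a minimal illustrative sketch only; the embedding dimension and match threshold are assumptions, and the SE3's actual matching implementation is not described in this document.

import numpy as np

EMBED_DIM = 512          # assumed embedding size; real models vary
DB_SIZE = 50_000         # matches the advertised face-database capacity

# Pretend database of L2-normalised face embeddings (one row per enrolled face).
rng = np.random.default_rng(0)
db = rng.standard_normal((DB_SIZE, EMBED_DIM)).astype(np.float32)
db /= np.linalg.norm(db, axis=1, keepdims=True)

def identify(query_embedding, threshold=0.6):
    """Return (best_index, similarity), or (None, similarity) if below threshold."""
    q = query_embedding / np.linalg.norm(query_embedding)
    scores = db @ q                      # cosine similarity against all 50,000 entries
    best = int(np.argmax(scores))
    sim = float(scores[best])
    return (best, sim) if sim >= threshold else (None, sim)

# Example query: a noisy copy of entry 1234 should match itself.
query = db[1234] + 0.05 * rng.standard_normal(EMBED_DIM).astype(np.float32)
print(identify(query))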
BMNNSDK (BITMAIN Neural Network SDK) is a one-stop toolkit comprising the underlying driver environment, a compiler, and inference deployment tools. It covers model optimization, efficient runtime support, and the other capabilities required for neural network inference, providing an easy-to-use, full-stack solution for developing and deploying deep learning applications. BMNNSDK shortens the development cycle and reduces the cost of algorithms and software, so users can quickly deploy deep learning algorithms on SOPHGO's AI hardware products and build intelligent applications.
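BMNNSDK's compiler consumes models exported from the supported frameworks. The exact compiler commands and APIs are vendor-specific and not covered in this document; the framework-side preparation typically resembles the PyTorch trace below, where the model, file name, and input shape are placeholders.

import torch
import torchvision

# Placeholder network standing in for a user-trained model.
model = torchvision.models.resnet18(weights=None).eval()

# Trace the model into TorchScript. A serialized model file plus its input
# shape is the usual starting point for an offline compiler front end; the
# output filename here is arbitrary.
example_input = torch.randn(1, 3, 224, 224)
traced = torch.jit.trace(model, example_input)
traced.save("resnet18_traced.pt")

# The traced file and its input shape (1, 3, 224, 224) would then be passed
# to the BMNNSDK compiler to produce a model targeting the BM1682; refer to
# the SDK documentation for the actual command or API.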
Chip
    Model: SOPHON BM1682
    CPU: 4-core Cortex-A53 @ 1.8GHz
AI Performance
    FP32: 3 TFLOPS peak
Memory and storage
    Memory: 8GB
    eMMC: 32GB
External interface
    Ethernet: 10/100/1000Mbps adaptive
    Storage: MicroSD × 1
Mechanical
    Dimensions (L × W × H): 210mm × 115mm × 45mm
Power supply and power consumption
    Power supply: DC 12V
    Typical power consumption: ≤35W (depending on the configuration)
Temperature and humidity
    Operating temperature: -10℃ ~ +45℃
    Humidity: 10% ~ 90%, non-condensing