Huawei’s latest data center switch uses an embedded artificial intelligence (AI) chip to boost network performance.
The vendor announced the new product at its spring 2019 event, and the launch follows news that Intel and Facebook have an AI chip for data centers under development. It also comes as the embattled Chinese company faces mounting political troubles and additional bans in the U.S. and Europe.
The new CloudEngine 16800 has three features that Kevin Hu, president of Huawei's Network Product Line, says are vital to “data center switches in the AI era”: an embedded AI chip, 48-port 400GE line cards per slot, and the capability to evolve to an intent-based network.
Data center network performance affects AI computing power, and it is becoming a bottleneck for commercial AI workloads, the vendor says. With traditional Ethernet, a packet-loss rate of just 1 percent caps the AI computing power of data centers at 50 percent. Meanwhile, the annual volume of data worldwide is skyrocketing: IDC forecasts it will increase from 33 zettabytes (33 billion terabytes) in 2018 to 175 zettabytes in 2025. Existing 100GE data center networks won't be able to handle that data flood.
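To put IDC's forecast in perspective, the quoted figures imply a steep compound annual growth rate. A quick sketch (using only the numbers cited above) works it out:

```python
# Implied compound annual growth rate (CAGR) of IDC's forecast:
# 33 ZB in 2018 growing to 175 ZB in 2025, a span of 7 years.
start_zb, end_zb = 33, 175
years = 2025 - 2018

cagr = (end_zb / start_zb) ** (1 / years) - 1
print(f"Implied growth: {cagr:.1%} per year")  # roughly 27% per year
```

At roughly 27 percent compound growth per year, traffic more than quintuples over the forecast window, which is consistent with the five-fold traffic-growth requirement Huawei cites below.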
Additionally, traditional methods won’t be able to meet requirements as the number of data center servers increases, and the computing network, storage network, and data network converge, Huawei says.
Embedded AI Chip
The new switch tackles these problems with an embedded AI chip that runs the iLossless algorithm to automatically sense and optimize the traffic model, delivering lower latency and higher throughput with zero packet loss. The CloudEngine 16800 overcomes the computing-power limitations caused by packet loss on traditional Ethernet, raising AI computing power from 50 percent to 100 percent and improving data-storage input/output operations per second (IOPS) by 30 percent, the vendor claims.
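Why would a 1 percent loss rate halve compute utilization? Huawei doesn't publish its methodology, but one commonly cited mechanism in RDMA-style fabrics is go-back-N retransmission, where a single dropped packet forces the sender to resend the entire in-flight window. A toy model (not Huawei's, and with a hypothetical 100-packet window) shows how a small loss rate can collapse goodput:

```python
def gbn_goodput(loss_rate: float, window: int) -> float:
    """Toy go-back-N model: each lost packet forces roughly one full
    window of retransmissions, so expected packets sent per packet
    delivered is (1 + loss_rate * window)."""
    return 1.0 / (1.0 + loss_rate * window)

# With a 1% loss rate and a 100-packet in-flight window,
# effective goodput drops to half of link capacity.
print(gbn_goodput(0.01, 100))  # 0.5
```

Under this illustrative model, eliminating packet loss entirely (loss_rate = 0) restores goodput to 100 percent, which is the intuition behind the "50 percent to 100 percent" claim.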
The switch's 48-port 400GE line cards per slot and 768-port 400GE switching capacity meet the requirements of five-fold traffic growth, and it reduces power consumption per bit by 50 percent, Huawei says.
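Taking the quoted port counts at face value, a little arithmetic shows what those specs imply for the chassis and its aggregate bandwidth:

```python
# Back-of-the-envelope figures from the quoted specs, taken at face value.
ports_per_slot = 48     # 400GE ports per line card/slot
total_ports = 768       # full-chassis 400GE port count
port_speed_gbps = 400   # per-port line rate

slots = total_ports // ports_per_slot                   # implied slot count
aggregate_tbps = total_ports * port_speed_gbps / 1000   # total switching bandwidth

print(f"{slots} slots, {aggregate_tbps} Tbps aggregate")  # 16 slots, 307.2 Tbps aggregate
```

In other words, the quoted numbers imply a 16-slot chassis with over 300 Tbps of aggregate 400GE capacity.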
The AI chip also improves the intelligence of devices deployed at the network edge, enabling the switch to run local inference and make decisions in real time. It works with a centralized network analyzer called FabricInsight that the vendor says can identify faults in seconds and automatically locate them in minutes, improving the system's flexibility and helping advance a self-correcting, self-driving network.
Growing Political Pressure
The new data center switch comes amid growing security concerns about the China-based vendor’s products. It’s already facing an outright ban of its telecom gear in the U.S., and last week the U.S. Commerce Department reportedly said it won’t allow the company to send some of the technologies developed at its Silicon Valley subsidiary back to China.
Poland also arrested a Huawei employee and charged him with spying for the Chinese government, and the country may consider barring the use of Huawei products by public agencies, a senior government official told Reuters on Sunday.
Huawei said it “complies with all applicable laws and regulations in the countries where it operates.” And on Saturday the company announced it fired the employee who was arrested in Poland because he “brought Huawei into disrepute.”