California, Oct 16, 2023 - Wiwynn (TWSE:6669), a leading cloud IT infrastructure provider for hyperscale data centers, is set to showcase its purpose-built AI acceleration server building blocks as part of the Open Compute Project (OCP) Global Summit 2023 on October 17-19, 2023.
Generative AI has demonstrated its profound impact on many aspects of daily life and the workplace. The emergence of large language models (LLMs) has driven a surge in demand for diverse, sustainable computing acceleration across AI model training, customization, and inference, from the cloud to the edge. In response to this escalating demand and the vast market opportunity, Wiwynn is committed to developing AI server building blocks, rooted in the OCP open AI platform and tailored to different needs, to achieve the best total cost of ownership (TCO).
For AI training, Wiwynn’s Gemini, built upon the OCP Grand Teton platform, adopts a modular approach, accommodating a diverse range of CPU trays, accelerator trays, and bridge trays (including switch, NIC, SSD, and memory) to meet various acceleration scenarios. It will support cutting-edge solutions from prominent technology leaders, including NVIDIA, Intel, Intel Habana, AMD, and more. Additionally, by integrating Wiwynn’s cold-plate liquid cooling solution, the Gemini platform not only fulfills the formidable computing power requirements of AI applications but also optimizes power efficiency, helping to ensure data center sustainability. This translates into an extensive array of choices and TCO optimization benefits for data centers in the AI era.
For edge AI applications, the Wiwynn® ES200G2 is an openEdge server supporting two dual-width FHFL PCIe Gen5 x16 cards, such as the NVIDIA L40S GPU. Its compact, short-depth form factor fits seamlessly into diverse edge locations while offering flexible support for all-in-one private 5G CU/DU functionality, edge AI lightweight training, and AI inference. It is the ideal solution for edge-centric use cases such as smart manufacturing, autonomous vehicles, and CDNs, where AI support for real-time, low-latency applications is paramount.
The AI market is flourishing with boundless potential. Its manifold application services require diverse hardware architectures for accelerated computing, driving the pursuit of cost and performance optimization. Wiwynn is steadfast in its commitment to providing a comprehensive set of building blocks built around a modular design concept, facilitating the efficient development, deployment, and implementation of AI applications, whether in the cloud or at the edge.
We cordially invite you to visit Wiwynn at Booth B3 and embark on a journey to explore the future of AI together.
About Wiwynn
Wiwynn is an innovative cloud IT infrastructure provider of high-quality computing and storage products, plus rack solutions, for leading data centers. We are committed to the vision of “unleash the power of digitalization; ignite the innovation of sustainability”. The Company aggressively invests in next-generation technologies to provide the best TCO (Total Cost of Ownership) and workload- and energy-optimized IT solutions from cloud to edge.
Get more information on Wiwynn’s Facebook, LinkedIn, and website.