SV5270G2-B (Balanced) / SV5270G2-D (Computing Density)
*SV5270G2 servers are installed in the SC2000 (12U OCS chassis)
Power-efficient server optimized for applications that balance computing and storage
Processor | Intel® Xeon® E5-2600 V4 |
---|---|
Memory | Up to 512GB; 8GB/16GB/32GB DDR4 up to 2133MT/s; 16 DIMM slots |
Processor Sockets | Two sockets |
Chipset | Intel® C610 series |
Storage | Ten 3.5” drive bays for SAS/SATA HDDs; eight M.2 SSD 2242/2260/2280 modules (via PCIe riser) |
Expansion Slots | Four PCIe x8 mezzanine slots; 40/50GbE QSFP port |
Power Supply | Centralized OCS PSU in the 12U chassis |
Dimensions | 1U rack; 43.5 (H) x 442.4 (W) x 830.4 (D) mm |
Weight | 16 kg |
OS Support List | Windows® Server 2012 R2 |
Supports web applications and web services with a compact combination of high computing performance and high power efficiency.
The simplest definition is that a Web server runs a website by returning HTML files over an HTTP connection. That definition may have been accurate in the early days of the Internet, but the line between websites, Web applications, and Web services has since blurred.
For example, a server that delivers an XML document to another device can be a Web server. A better definition might be that a Web server is any Internet server that responds to HTTP requests to deliver content and services.
1. Virtual hosting to serve many web sites using one IP address
2. Large file support to be able to serve files larger than 2 GB on a 32-bit OS
3. Bandwidth throttling to limit the speed of responses so the network is not saturated and more clients can be served (see the sketch after this list)
4. Server-side scripting to generate dynamic web pages, still keeping web server and website implementations separate from each other
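As a concrete illustration of feature 3, here is a minimal sketch using Python's standard http.server module: the handler sends its response body in small chunks and sleeps between writes so the transfer rate stays near a chosen cap. The port, chunk size, rate limit, and body are illustrative assumptions, not part of the SV5270G2 documentation.

```python
import time
from http.server import BaseHTTPRequestHandler, HTTPServer

# Illustrative numbers only, not taken from this datasheet.
RATE_BYTES_PER_SEC = 64 * 1024      # target rate: ~64 KiB/s per response
CHUNK = 8 * 1024                    # write 8 KiB at a time

class ThrottledHandler(BaseHTTPRequestHandler):
    """Send a fixed HTML body in small chunks, sleeping between writes so the
    effective transfer rate stays near RATE_BYTES_PER_SEC."""
    BODY = b"<html><body>" + b"x" * (512 * 1024) + b"</body></html>"

    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(self.BODY)))
        self.end_headers()
        for start in range(0, len(self.BODY), CHUNK):
            self.wfile.write(self.BODY[start:start + CHUNK])
            time.sleep(CHUNK / RATE_BYTES_PER_SEC)   # pace the writes

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), ThrottledHandler).serve_forever()
```

Fetching http://localhost:8080/ with a browser or curl should show the roughly 512 KB body arriving over about eight seconds at these settings.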
Balances high performance with power consumption to store and process complex data while minimizing data movement.
HDFS (the Hadoop Distributed File System) stores large amounts of data across multiple machines, typically hundreds or thousands of simultaneously connected nodes, and provides data reliability by replicating each data instance as three copies: two in one group and one in another. These copies can be replaced in the event of failure.
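The snippet below is a toy sketch of that placement rule, assuming a hypothetical two-rack cluster with made-up DataNode names; it is not Hadoop's actual block placement code, only an illustration of "two copies in one group, one in another."

```python
import random

def place_replicas(racks, replication=3):
    """Toy version of the rule described above: with three copies, put two on
    DataNodes in one rack and the third on a DataNode in a different rack.
    Illustration only; not Hadoop's real BlockPlacementPolicy."""
    assert replication == 3, "sketch only covers the default 3-copy case"
    rack_a, rack_b = random.sample(list(racks), 2)   # pick two distinct racks
    first_two = random.sample(racks[rack_a], 2)      # two copies in one group
    third = random.choice(racks[rack_b])             # one copy in another
    return first_two + [third]

# Hypothetical two-rack cluster with made-up DataNode names.
racks = {
    "rack1": ["datanode1", "datanode2", "datanode3"],
    "rack2": ["datanode4", "datanode5"],
}
print(place_replicas(racks))   # e.g. ['datanode2', 'datanode1', 'datanode5']
```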
The HDFS architecture consists of clusters, each accessed through a single NameNode running on a dedicated machine that monitors and manages the cluster's file system and user access. Each of the other machines runs a DataNode instance that manages its share of the cluster's storage.
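To show how a client interacts with this NameNode/DataNode split, the sketch below assumes the third-party HdfsCLI Python package (pip install hdfs) and a hypothetical WebHDFS endpoint on the NameNode; metadata operations go to the NameNode, while file blocks are written to and read from DataNodes.

```python
from hdfs import InsecureClient   # third-party HdfsCLI package: pip install hdfs

# Hypothetical WebHDFS endpoint on the cluster's NameNode; the NameNode holds
# the namespace and metadata, while file blocks live on the DataNodes.
client = InsecureClient("http://namenode.example.com:9870", user="hadoop")

# Write a small file; HDFS splits it into blocks and replicates each block.
client.write("/tmp/sample.txt", data=b"hello hdfs", overwrite=True)

# Read it back; the client is redirected to a DataNode holding a replica.
with client.read("/tmp/sample.txt") as reader:
    print(reader.read())

# Directory listings come from the NameNode's view of the namespace.
print(client.list("/tmp"))
```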
Fast-growing category of high-value applications that are increasingly employed by business and technical computing users.
These users select standard servers to handle different Hadoop functions and then assemble them into a complete Hadoop environment. Cloudera, a Big Data Hadoop provider, certifies hardware specifically for this purpose.
Category | File Title | Release Date | Actions |
---|---|---|---|
Datasheet | Datasheet - SV5270G2 Series | 2017/06/20 | Download |