What's News

Find the latest industry events, technology news and our breakthrough products.
Stay up to date with what's happening in the industry.


  • How much does it cost to build a data center?

    As the world continues to go digital, the data center industry keeps growing and new data centers are being built all the time.

    However, building a data center can be very costly.

    The initial investment varies with the size and scope of the project. Estimates put data center build costs at between $600 and $1,000 per square foot, or $7 million to $12 million per megawatt of IT load.
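
    As a rough illustration of those figures, here is a minimal sketch in Python that turns the quoted per-square-foot and per-megawatt ranges into a ballpark budget. The 50,000 sq ft facility size and 5 MW IT load are hypothetical example inputs, not figures from this article.

        # Ballpark build cost from the per-square-foot and per-megawatt
        # estimates quoted above. Facility size and IT load are hypothetical.
        COST_PER_SQFT = (600, 1_000)            # USD per square foot (low, high)
        COST_PER_MW = (7_000_000, 12_000_000)   # USD per MW of IT load (low, high)

        def build_cost_range(square_feet: float, it_load_mw: float) -> dict:
            """Return low/high cost estimates from both rules of thumb."""
            return {
                "by_area_usd": (square_feet * COST_PER_SQFT[0],
                                square_feet * COST_PER_SQFT[1]),
                "by_power_usd": (it_load_mw * COST_PER_MW[0],
                                 it_load_mw * COST_PER_MW[1]),
            }

        print(build_cost_range(square_feet=50_000, it_load_mw=5))
        # Roughly $30M-$50M by floor area, and $35M-$60M by IT load.

    The two rules of thumb give overlapping but not identical ranges, so treat figures like these as a starting point for budgeting rather than a quote.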

    Despite the high upfront and ongoing costs, businesses continue to invest in data centers due to the many benefits they offer.

    So let's get into how much it costs to build a data center, including the building itself and equipping the facility with power and cooling infrastructure, as well as some more affordable alternatives to starting from scratch.

     


    The data center building

    First, the cost of the building itself.

    Data centers come in various shapes and sizes, from small facilities that occupy a few thousand square feet to multi-story buildings that span hundreds of thousands of square feet.

    The cost of building a data center depends heavily on its size and features.

    Size

    Smaller data centers can be built for around $10 million, while the largest facilities can cost upwards of $2 billion.

    A data center should not be built too big or too small.

    Starting with the minimum required size keeps initial costs down; however, the data center should be scalable so it can handle the ever-growing needs of the business.

    Scalable data centers

    A scalable data center space can easily accommodate an increase in the number of servers and other hardware required to keep things running smoothly. Scalability can be designed in from the beginning to allow for future growth, or the facility can be retrofitted later if needed. This may add some extra upfront cost but can result in savings in the long run.

    Location

    Location also greatly affects the cost of building a data center.

    For example, construction costs in Silicon Valley are much higher than in other parts of the country. A data center location must also have good access to power and network connections, which can drive up costs.

    The cost of real estate can be a major factor in deciding where to build a data center. In some cases, businesses may need to lease or purchase property specifically for their data center needs. This can be expensive, but there are some tax incentives available for data center development.

     

    What infrastructure is needed when building data centers?

    In addition to the cost of the building itself, businesses must also consider the cost of equipping the data center with power and cooling infrastructure.

    All of this equipment is necessary to keep servers and other hardware running smoothly, so it's important to budget for it when planning a data center build-out in order to keep costs under control.

    Electrical systems

    The electrical systems are estimated to account for around 40-45% of the total cost of a data center.

    These systems include:

    • Electrical backup generator
    • Batteries
    • Power distribution unit (PDU)
    • Uninterruptible power supply (UPS)
    • Switchgear/transformers

    Cooling infrastructure

    Cooling infrastructure comprises roughly 15-20% of the total cost of a typical data center; a rough dollar breakdown of both shares is sketched after the list below.

    This includes:

    • Air conditioning
    • Computer room air handler
    • Air cooled chillers
    • Chilled water storage
    • Pipes
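
    To make those percentages concrete, here is a minimal sketch that splits a hypothetical $50 million total project budget (an assumed figure, not one from this article) according to the electrical and cooling shares quoted above.

        # Split a hypothetical total budget using the shares quoted above:
        # electrical systems ~40-45%, cooling infrastructure ~15-20%.
        TOTAL_BUDGET_USD = 50_000_000   # hypothetical example budget

        def share_range(total: float, low: float, high: float) -> tuple[float, float]:
            return total * low, total * high

        electrical = share_range(TOTAL_BUDGET_USD, 0.40, 0.45)
        cooling = share_range(TOTAL_BUDGET_USD, 0.15, 0.20)

        print(f"Electrical systems: ${electrical[0]:,.0f} - ${electrical[1]:,.0f}")
        print(f"Cooling systems:    ${cooling[0]:,.0f} - ${cooling[1]:,.0f}")
        # Electrical systems: $20,000,000 - $22,500,000
        # Cooling systems:    $7,500,000 - $10,000,000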

     

    The data center tiers and their cost

    Another factor that influences the cost is the data center tier. There are four tiers of data centers, defined by the Uptime Institute.

    Tier I data center

    This is the most basic level and is suitable for small businesses or departments. A Tier I data center has a single path for power and cooling and no redundant components. This is the cheapest option, but it has the major downside of not being able to withstand a power outage.

    Tier II data center

    This tier has most of the features of Tier I, including only one path for power and cooling, but adds some redundant capacity components, such as backup generators, UPS modules and cooling units. This ensures that the data center can continue to operate in the event of a component failure, such as a cooling failure. Estimates put the cost of these data centers at about $4.5-6.5 million per megawatt.

    Tier III data center

    This is the most common data center tier. A Tier III data center must meet all of the requirements of a Tier II center, but must also have redundant distribution paths for power and cooling, so the facility can keep operating while equipment is maintained or a path fails. These cost roughly double a comparable Tier II data center.

    Tier IV data center

    The highest level of certification - Tier IV data centers are designed for critical operations such as healthcare. These centers must have fully redundant everything, including power, cooling, networking and security systems. They are also subject to more rigorous testing and certification requirements and cost the most to build!

    Businesses should choose the tier that best meets their needs; together with the size and requirements of the facility, this choice will greatly influence the cost.
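
    To compare the tiers on a per-megawatt basis, here is a minimal sketch that uses the Tier II range quoted above ($4.5-6.5 million per megawatt) and the "roughly double" rule of thumb for Tier III; the 3 MW IT load is a hypothetical example.

        # Rough build-cost comparison using the figures quoted above.
        TIER_II_PER_MW = (4_500_000, 6_500_000)
        TIER_III_PER_MW = tuple(2 * x for x in TIER_II_PER_MW)  # ~double Tier II

        it_load_mw = 3  # hypothetical example IT load

        for tier, (low, high) in (("Tier II ", TIER_II_PER_MW),
                                  ("Tier III", TIER_III_PER_MW)):
            print(f"{tier}: ${low * it_load_mw:,.0f} - ${high * it_load_mw:,.0f}")
        # Tier II : $13,500,000 - $19,500,000
        # Tier III: $27,000,000 - $39,000,000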

     

    Alternatives to building a data center

    While tech giants and other very large companies might build their own private data center, whether an enterprise data center (constructed and used by a single organization), an on-premises data center (located on the company's own premises, as the name suggests) or a single-tenant data center (dedicated to a single client or company), there are more affordable options.

    Colocation data center

    A colocation data center is a great option for businesses that don't have the resources or need for their own data center. This type of data center is housed in a third-party facility, and businesses can lease space and equipment there.

    This is a very affordable option, as businesses only pay for the space and services they use under a shared cost structure. Colocation data centers also offer flexibility, as businesses can add or reduce their space and services as needed.

    Cloud data center

    Cloud data centers are another great alternative to on-premises or enterprise data centers. They offer all of the benefits of a traditional data center, but without the cost or hassle of building and maintaining your own infrastructure.

    Cloud data centers are housed in secure, climate-controlled facilities, and are connected to high-speed networks for fast, reliable access. They offer a wide range of services, including cloud hosting, cloud storage, and cloud computing.

    Businesses can save money and time by choosing a cloud data center instead of building their own. Cloud data centers are scalable, so businesses can grow their infrastructure as needed. And they offer flexibility and convenience, allowing businesses to access their data from anywhere at any time.

     

    Cost to build a data center - in summary

    The total cost to build a new data center varies greatly depending on the size and complexity of the project, ranging from tens of millions to billions of dollars. Major influencing factors include the location, size, tier and infrastructure you choose.

    There are also many alternatives to building a data center, including colocation and cloud data centers which offer businesses flexibility and convenience while saving money.

    For more about data centers, head to our info center, where we cover all there is to know about data centers, from airflow management to reducing energy consumption.


  • Everything you need to know about data center power

    Data center power is one of the most important aspects of keeping a data center up and running.

    Without adequate power, a data center can grind to a halt very quickly.

    There are many different factors to consider when planning data center power, from the type of power source to the capacity and redundancy needs.

    In this article, we'll cover the entire data center power system so you know everything you need to.

     


    The power source

    The first thing to consider when planning data center power is the power source.

    There are two types of power sources that are most common in today's data centers:

    AC power

    AC (alternating current) power is electricity in which the current periodically reverses direction. It is the traditional form of utility electricity and is used in most data centers worldwide; in a data center it is typically distributed to IT equipment at around 100-240 volts. It's a reliable, stable source of power and can easily be stepped from one voltage to another.

    DC power

    DC (direct current) power is used less commonly. In contrast to AC, the current remains constant and does not change direction. DC is typically associated with the battery energy stored in a UPS, rather than with directly powering most data center components.

    Where is power supplied from?

    Most data centers use the municipal electric grid as their main source of power. The grid supplies the energy, which is then transformed on site to the voltage and current best suited to the needs of the data center.

    Some data centers look at other sources of electricity to supplement or substitute for the electric grid. Most data centers also keep their own generators on site to provide emergency power when the grid supply is interrupted.

     

    Data center power design

    Power capacity

    Once you've chosen a type of power source, you need to determine the power capacity needed for your data center. This is the maximum amount of power that can be delivered to the data center at any given time. It's important to have enough capacity to meet the needs of your data center, but you also need to make sure the capacity isn't so high that it becomes unnecessarily expensive.
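
    As a minimal sketch of how that capacity might be sized, the example below works up from rack count and average power per rack and adds headroom for growth. All of the input values are hypothetical assumptions, not figures from this article, and the result covers the IT load only; facility overhead such as cooling comes on top, which is what the PUE metric discussed next captures.

        # Rough power-capacity sizing from rack count and average load per rack.
        racks = 200                 # hypothetical number of racks
        avg_kw_per_rack = 6.0       # hypothetical average IT load per rack, kW
        growth_headroom = 0.25      # capacity reserved for future growth

        it_load_kw = racks * avg_kw_per_rack
        required_capacity_kw = it_load_kw * (1 + growth_headroom)

        print(f"IT load:           {it_load_kw:,.0f} kW")
        print(f"Required capacity: {required_capacity_kw:,.0f} kW")
        # IT load:           1,200 kW
        # Required capacity: 1,500 kW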

    Power usage effectiveness

    Then you need to consider power usage effectiveness (PUE). This is a metric used to measure the efficiency of a data center: PUE is calculated by dividing the total amount of power used by the data center by the amount of power used by the IT equipment alone.
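
    As a quick worked example of that formula (the kilowatt figures below are hypothetical):

        # PUE = total facility power / IT equipment power, as defined above.
        total_facility_kw = 1_800   # IT plus cooling, lighting, power losses
        it_equipment_kw = 1_200     # servers, storage and network gear only

        pue = total_facility_kw / it_equipment_kw
        print(f"PUE = {pue:.2f}")   # PUE = 1.50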

    A lower PUE indicates a more efficient data center. There are many things that can affect PUE, from the type of equipment to the way the data center is designed. There are also many ways to improve PUE, such as using energy-efficient equipment and optimizing airflow in the data center.

    Improving PUE is critical for many data centers that are struggling to keep up with increasing power demands. By using strategies to optimize power usage, data centers can not only improve their PUE but also save money on their energy bills.

    Power redundancy

    Data center power redundancy is essential for keeping your data center up and running. A power outage can cause serious damage to equipment and result in data loss. That's why it's important to have a backup power source, including an uninterruptible power supply (UPS).

    There are two types of power redundancy:

    • N+1 redundancy: This type of redundancy means that you have one more power supply than you need. This provides a level of protection in case of a failure of one of the power supplies.
    • N+2 redundancy: This type of redundancy means that you have two more power supplies than you need. This provides a higher level of protection in case of a failure of one of the power supplies.

    Either type of redundancy helps ensure that your data center remains up and running in case of an outage: if the utility grid fails, the UPS carries the load until the backup generators take over. With a redundant power supply in place, you can rest assured that your data center will be able to continue operations.
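
    As a small illustration of what N+1 and N+2 mean in practice, the sketch below counts UPS modules for a hypothetical load and module size (both assumed values, not figures from this article).

        import math

        # How many UPS modules are needed under N, N+1 and N+2 redundancy.
        it_load_kw = 900        # hypothetical IT load
        ups_module_kw = 250     # hypothetical capacity of one UPS module

        n = math.ceil(it_load_kw / ups_module_kw)  # modules needed to carry the load

        print(f"N (no redundancy): {n} modules")
        print(f"N+1 redundancy:    {n + 1} modules")
        print(f"N+2 redundancy:    {n + 2} modules")
        # N (no redundancy): 4 modules
        # N+1 redundancy:    5 modules
        # N+2 redundancy:    6 modules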

     

    Other data center power best practices

    Management software

    Data center power monitoring and management software is essential for ensuring that your data center power design is efficient. By using software to monitor power usage, you can identify any areas where you could optimize power consumption and improve your PUE.

    Software also allows you to track and manage your generator systems, making sure that they are always up and running when needed. This is important for ensuring that your data center has a continuous source of power in the event of an outage.

    Sustainability

    Sustainability is a critical consideration for data centers. Not only is it important to be environmentally friendly, but it's also important to be economical. By using strategies to make your data center more sustainable, you can save money on your energy bills.

    There are many ways to make your data center more sustainable. One of the simplest ways is to use energy-efficient equipment. You can also improve the efficiency of your data center by optimizing airflow.

    Breaker coordination

    Breakers are another essential part of data center power systems. They are used to protect the equipment in the data center from damage caused by electrical faults.

    There are two types of breakers:

    • Molded case breakers: These breakers are used to protect circuits from overloads and short circuits.
    • Air circuit breakers: These breakers are generally used on higher-current circuits, such as main distribution, to protect larger feeders and equipment from faults.

    Scalability

    A data center's scalability is determined by its ability to handle increasing or decreasing amounts of data. A data center that is scalable can handle changes in demand without having to make any major changes to the infrastructure. This is important for businesses that are constantly growing and expanding.

    There are many ways to make a data center more scalable. One of the simplest ways is to use modular data center equipment. This allows you to add or remove modules as needed, without having to make any major changes to the data center. You can also use cloud-based solutions to increase scalability. This allows you to add more resources as needed, without having to make any physical changes to the data center.

     

    Power in data centers

    Data centers are essential for businesses today. Not only do they allow you to store your data in a secure location, but they also provide a reliable source of power for your equipment. By using a data center, you can be sure that your business will always have access to the information it needs.

    In order to ensure that your data center runs smoothly, it's important to understand the different aspects of data center power. We've discussed some of the most important concepts, including power sources, measures of power efficiency and breaker coordination. By understanding these concepts, you can make sure that your data center is safe and efficient.

    For more information on data centers, head to our info hub where you can learn more about having an energy efficient, green data center, or learn how to set up your own data center.


  • Is a green data center good for business?

    Data centers became a more prominent part of our lives, and the economy, as Internet usage transitioned from a rarity to the norm.

    As this occurred, data center energy consumption, and with it the industry's carbon footprint, grew exponentially year after year, and it continues to do so.

    Simultaneously, sustainability began evolving into an essential component of our global economy.

    As a result, to reduce their environmental impact, many data center operators have become green, using green or renewable energy sources.

    But is this really enough?

    Is it worth it for businesses to make the switch to green data centers?

    The short answer is yes – but there are a number of reasons why businesses should consider making the switch to green data centers.

    In this article, we will go into why green data center solutions are good for business.

     


    What are green data centers?

    You may be wondering, what exactly does it mean to be a green data center?

    A green data center is technically a normal data center; however, it has been designed and built to minimize its environmental impact. This includes reducing energy consumption, using renewable energy, and minimizing the use of harmful materials.

    What are renewable energy sources?

    Renewable energy sources are alternative energy solutions that can be replaced or replenished.

    They have the potential to reduce carbon emissions by providing significant amounts of electricity without emitting greenhouse gasses (GHGs), which contribute to climate change.

    The most common renewable energies are those generated from:

    • Sunlight (solar energy)
    • Winds (wind turbines)
    • Movement of water (hydroelectricity)
    • Biomass or biofuels (fuels made up of organic substances)
    • Geothermal activity.

    In contrast, data centers that aren't green run on brown energy, generated from fossil fuels that emit carbon.

    There are also technologies such as nuclear power that are generally called zero carbon but can carry environmental issues associated with the disposal of radioactive waste.

     

    The benefits for your business of green data centers

    Green data centers have many advantages for businesses. These include:

    Energy savings

    Saving energy is one of the primary reasons to make the switch to greener data center solutions.

    A green data center can be up to 40% more energy efficient than a traditional data center.

    This is due to a number of factors, such as the use of renewable energy sources and more efficient cooling systems.

    In addition, by making use of IT equipment that is more energy efficient, businesses can further reduce their energy usage.

    A useful metric for measuring the efficiency of energy usage is power usage effectiveness (PUE).
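
    To put a rough number on those savings, here is a minimal sketch that applies the "up to 40%" figure above to a hypothetical annual energy bill; the consumption and electricity price are assumed values, not figures from this article.

        # Rough annual savings from an up-to-40% efficiency improvement.
        annual_energy_kwh = 8_760_000   # hypothetical: ~1 MW average draw for a year
        price_per_kwh = 0.10            # hypothetical USD per kWh
        efficiency_gain = 0.40          # "up to 40%", per the figure above

        baseline_cost = annual_energy_kwh * price_per_kwh
        potential_savings = baseline_cost * efficiency_gain

        print(f"Baseline annual energy cost: ${baseline_cost:,.0f}")
        print(f"Potential annual savings:    ${potential_savings:,.0f}")
        # Baseline annual energy cost: $876,000
        # Potential annual savings:    $350,400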

    Environmental sustainability

    A green data center is also much more environmentally sustainable than a traditional data center. This is due to the fact that they use sustainable or renewable energy sources which release fewer pollutants into the air.

    Reducing the amount of pollution is not only less damaging for the environment but also less harmful to people’s health.

    Additionally, green data centers are often built with sustainable materials that can be reused or recycled, reducing the amount of waste produced by the data center, and helping to minimize its carbon footprint.

    By making the switch to green data centers, businesses can help reduce the amount of greenhouse gasses emitted into the atmosphere.

    Data centers account for a significant share of global electricity consumption, so anything that can be done to reduce their impact is a good thing.

    Reduced costs

    Green data center technologies reduce power consumption, and therefore costs, for businesses. The primary way this is achieved is by using less energy than traditional data centers, which can lead to significant savings on energy bills.

    Many green technologies are becoming more affordable as they become more popular, and, by using these technologies, businesses can save money while also helping the environment.

    Improved brand image and customer loyalty

    Apart from the benefits listed above, making the switch can also improve a company’s brand image and customer loyalty.

    It shows that a business is committed to sustainability and cares about the environment, which can help to build positive associations with the brand and increase customer loyalty.

    Consumer awareness is increasing regarding environmental concerns.

    Customers consider a company's values to be just as relevant as its responsibilities in business operations. According to recent analysis, nearly 80% of Americans want to know how renewable energy works and how it is used.

    Using green data center technologies can make a business look cutting edge and innovative, which can also attract new customers and help to set a company apart from its competitors.

     

    Are there any challenges to going green?

    While there are many benefits to using green data center technologies, there are also some challenges that businesses need to be aware of.

    The first challenge is that making the switch can be expensive. This is because businesses need to update their IT equipment and install new technologies. They may also need to make changes to their buildings and infrastructure.

    Another challenge is that not all green data center technologies are created equal. Some are more expensive than others, and not all of them are as efficient as they claim to be.

    Businesses need to do their research before making the switch to ensure that they’re getting the best deal possible.

    Occasionally it can be difficult to find qualified professionals who are knowledgeable about green data center technologies. This can lead to delays in implementing these technologies and increased costs.

    In order to overcome these challenges, businesses should:

    • Make a detailed plan for transitioning to green data centers
    • Invest in high-quality, energy-efficient IT equipment
    • Work with qualified professionals who understand green data center technologies

     

    Going green is worth it for your business

    Green data centers are becoming more and more popular as businesses become more aware of the benefits they offer.

    There are many reasons to make the switch, including reduced costs, improved brand image and customer loyalty, and reduced environmental impact.

    While there are some challenges associated with making the switch to green data centers, these can be overcome with a little planning and effort.

    Overall, if you do it properly, having a green data center is great for business.

    For more information about data centers head to our info hub, or check out everything you need to know about data center power.


  • How airflow works inside of a server

    When it comes to servers, there are a few things you may already know, such as what a server is, where it should be kept (in a server room) and how it works - basically, a server needs airflow in order to keep it running.

    But do you know how that airflow actually works?

    Servers are a critical part of any business, and it’s important to understand them to keep things running smoothly.

    In this article, we'll take a look at how airflow works inside of a server. We'll discuss why it's important and what you can do to optimize it, and touch on airflow in server racks, server rooms and data centers.

     


    How does airflow in a server work?

    Server airflow works primarily through internal fans that create air pressure.

    Internal motherboards and other components within the server can obstruct airflow, but the fans are usually able to overcome this.

    The pressure pushes the hot air out of the server and draws in cool air from outside.

    By keeping the hot air moving out of the server, we can keep it running cooler and more efficiently.

    Why is airflow important?

    Airflow in a server is important for two reasons:

    1. It helps keep the server cool
    2. It helps to distribute the heat evenly.

    As electronic components generate heat when they are in use, it is important for servers to have good airflow so that the hot air can be dissipated and the cold air can circulate.

    Without moving air, warm air rises and cold air sinks. This can cause hot air to become trapped at the top of the server, which can lead to overheating and server failure.

     

    Airflow in server chassis and server racks

    Servers need to be arranged in external structures to allow for proper airflow.

    This is where server chassis and server racks come in.

    In most cases, a server chassis is fitted into a server rack and individual servers are mounted in the chassis.

    However, these arrangements can add extra resistance for the internal server fans to overcome, and if this resistance is too high the fans may not be able to cope.

    To combat this, a commonly used server chassis cooling design is a cold-air inlet.

    It is an intake fan that draws cold air in from outside the server chassis, helping the server manage heat by keeping its internal components cool in addition to the internal fans.

    In server racks, there are two types of airflow:

    Front-to-back

    Front-to-back, or front-to-rear, is the most common type of airflow in server racks. In this configuration, cooled air enters through the front of the server and hot air is expelled out the back.

    Back-to-front

    In a back-to-front configuration, the cold air enters through the back of the server and the hot air is expelled out the front. This type of airflow is not as common, but can be useful in some situations.

     

    What about airflow in server rooms or data centers?

    The same principles of airflow that apply to individual servers, or server chassis in server racks, can also be applied to a server room or data center.

    These bigger rooms, which often contain many servers, have additional mechanisms for airflow optimization, including a layout system of hot and cold aisles, which is based on the idea of keeping cold and warm air separate in contained aisles.

    Powerful air conditioning units are also used to keep cool air circulating where it is desired.

     

    Optimizing server airflow

    Optimizing airflow in a server can be a challenge, but with a little effort you can help ensure that your server runs safely and efficiently.

    Here are our five best tips for this:

    1. Understand how your server is designed

    To ensure that airflow is optimal, the first step is understanding the design of your server.

    Some servers have multiple fans, while others rely on a single fan to move air around.

    You need to be aware of the location of the intake and exhaust ports, and make sure that they are positioned so that the air can flow freely.

    2. Check the fans

    With the fans there are two main things to keep in mind:

    1. That there are enough fans installed in your unit
    2. That all of the fans are working properly

    3. Keep the area free of obstructions

    It's important to keep the area around the server clean and free of obstructions.

    Dust and other debris can clog up the vents and disrupt the airflow.

    Make sure that there is enough space between the server and other objects nearby (such as the cabinets or walls).

    4. Develop a thermal management plan

    From here, you can develop a server thermal management plan, a critical step in managing heat and keeping your servers running smoothly.

    It allows you to regulate the amount of heat that is dissipated from the server's components.

    5. Use blanking panels

    Blanking panels are a great way to improve the airflow inside of your server.

    They are simple, rectangular panels that fit into the empty spaces of the server rack.

    This helps to create a more consistent airflow, which can be helpful in preventing overheating.

     

    Understanding airflow in servers

    It is important to understand the principles of airflow in servers because it is key to keeping components cool and running smoothly.

    Without moving air, warm air rises and cold air sinks, which can cause hot air to become trapped at the top of the server.

    There are a number of ways to optimize airflow in a server: understanding how your server is designed, checking the fans, keeping the area free of obstructions, developing a thermal management plan, and using blanking panels.

    Find out more about how to manage the cables in your server racks or products to help you manage airflow in server rooms and data centers.

  • Albert Einstein is said to have defined knowledge as a practical understanding of a subject, and the same applies to operating and maintaining a data centre. The Center of Expertise for Energy Efficiency in Data Centers (CoE) at Lawrence Berkeley National Laboratory (LBNL), led by the U.S. Department of Energy, provides technical support, tools and technologies intended to optimise and reduce energy use in data centres. This is where both novices and grizzled veterans seek options for maximising new construction or retrofit results.

    Knowing what you don't know.

    The CoE has designed a data centre profiling tool, DC Pro, to help data centre operators diagnose how energy is being used and determine ways to save it. Combined with management information such as energy use and environmental sensing data, DC Pro can give data centre operators an estimated PUE and potential energy savings through its web-based, self-guided, menu-driven interface. Essentially, instead of telling you what you already know, it tells you what you don't know, or don't yet know.

    What improvement can we expect?

    We tried it with one of our candidate data centres, which has a PUE of 1.5; DC Pro estimates there is potential to improve this to a PUE of 1.2, a 0.3-point improvement. The tool looks at how energy use is distributed and, referencing its comprehensive set of databases and algorithms, derives key action plans that will significantly affect energy use. For example, part of the report DC Pro produced after analysing our candidate site shows that the key issues are cable congestion in the raised-floor plenums and the lack of air flow management and aisle containment. It estimates that if we implement proper air flow management, hot/cold aisle containment and cable management, we could reduce the share of cooling energy in total site energy use by up to 21% and increase IT power utilisation by 20%. A higher proportion of IT energy use means we are spending on what drives the business instead of compensating for unnecessary losses.
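
    To see what that 0.3-point PUE improvement means in energy terms, here is a minimal sketch assuming a hypothetical 1 MW IT load (the IT load is our assumption; the PUE figures are the ones reported above).

        # Translate a PUE improvement from 1.5 to 1.2 into site energy savings,
        # assuming a hypothetical 1 MW (1,000 kW) IT load.
        it_load_kw = 1_000
        pue_before, pue_after = 1.5, 1.2

        site_before_kw = it_load_kw * pue_before   # 1,500 kW total site power
        site_after_kw = it_load_kw * pue_after     # 1,200 kW total site power

        saving_kw = site_before_kw - site_after_kw
        saving_pct = 100 * saving_kw / site_before_kw
        annual_saving_kwh = saving_kw * 24 * 365

        print(f"Site power drops by {saving_kw:.0f} kW ({saving_pct:.0f}% of the total),")
        print(f"about {annual_saving_kwh:,.0f} kWh per year of avoided overhead.")
        # Site power drops by 300 kW (20% of the total),
        # about 2,628,000 kWh per year of avoided overhead.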

    Keeping it cool and knowing your power.

    With the help of tools like DC Pro, informed management is no longer something that only a specialised energy consultant can do. A data centre operator's goal should be to keep the facility cool while knowing exactly where its energy costs go. Implementing air flow management and containment systems can be very rewarding in the long run, and knowing your power will also go a long way.
