The Global Database of Data Center Industry Expertise
A 2009 Gartner study surveyed customer opinions about computer hardware reliability and the need for third-party maintenance (TPM) services. The results showed that customers are reconsidering the value of hardware maintenance and, in some cases, the need for third-party hardware maintenance at all.
More than ever, IT managers must justify the value of the hardware maintenance services they purchase. In the past, many large organizations did not consider third party…
Added by Chris MacKinnon on March 12, 2011 at 14:41 — No Comments
Viruses, spyware, and network threats get most of the attention, but environmental factors like heat, humidity, airflow, smoke, and electricity can be equally devastating to server room equipment, and thus to a company’s IT operations.
To get a sense of the danger, let’s take overheating as an example. Servers generate high levels of heat, and the facility must be kept cool to ensure optimal performance. The warmer it gets, the more likely equipment will overheat and malfunction.…
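To make the overheating risk concrete, a minimal monitoring check might compare a server's inlet temperature against a recommended ceiling. This is an illustrative sketch, not anything from the article: the function names are made up, and the 27 °C default is roughly the top of ASHRAE's recommended inlet envelope.

```python
# Toy overheating check. The 27 C ceiling (about the top of ASHRAE's
# recommended inlet range) and all names here are illustrative assumptions.
RECOMMENDED_MAX_INLET_C = 27.0

def check_inlet_temp(temp_c, threshold_c=RECOMMENDED_MAX_INLET_C):
    """Return an alert string if the inlet temperature exceeds the ceiling,
    otherwise None."""
    if temp_c > threshold_c:
        return (f"ALERT: inlet temperature {temp_c:.1f} C exceeds "
                f"{threshold_c:.1f} C ceiling")
    return None
```

In practice a facility would poll real sensors and feed breaches into its alerting channel; the comparison itself is this simple.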
Added by Chris MacKinnon on March 11, 2011 at 12:13 — No Comments
Over the last few weeks we have been focusing on the major trends facing the data center industry throughout 2011. We have discussed both the importance of data center business efficiencies and the effects of energy efficiency monitoring and regulation. The last area we’d like to take a look at is mitigating and…
Added by Chris MacKinnon on March 11, 2011 at 12:11 — No Comments
Why is automation technology useful in today’s enterprise data centers?
Automation technology has emerged as an enabler for organizations looking for a cost-effective, tangible way to expedite the processing of their IT and business assets. Today’s IT environment consists of multiple servers, both physical and virtual, multiple platforms and a wide range of applications. Organizations continue to place high value on the ability to reduce costs and risk through…
Added by Chris MacKinnon on March 10, 2011 at 12:16 — No Comments
Why are high-speed inline compression and data deduplication useful in today's enterprise data centers?
Today’s enterprise data centers are experiencing exponential data growth with limited budgets and staff. The cost of storage has become a major component of IT budgets, often growing faster and less predictably than other areas.
Inline compression and data deduplication allow data to be stored using less physical hardware than would otherwise be required. Depending…
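The core idea can be sketched in a few lines: hash each incoming chunk, store only unseen chunks (compressed), and keep an ordered manifest to rebuild the original stream. This is a toy inline model, not any vendor's implementation; the fixed chunking and hash choice are assumptions.

```python
import hashlib
import zlib

def dedupe_and_compress(chunks):
    """Store each unique chunk once, keyed by its SHA-256 digest and
    compressed with zlib; the manifest records the order needed to
    reconstruct the stream."""
    store, manifest = {}, []
    for chunk in chunks:
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in store:  # duplicate chunks cost no extra storage
            store[digest] = zlib.compress(chunk)
        manifest.append(digest)
    return store, manifest

def rebuild(store, manifest):
    """Reverse the process: decompress the stored chunks in manifest order."""
    return b"".join(zlib.decompress(store[d]) for d in manifest)
```

Three chunks of which two are identical occupy only two compressed entries in the store, which is where the hardware savings come from.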
Added by Chris MacKinnon on March 9, 2011 at 15:26 — No Comments
Survey results released at GigaOM Structure in 2010 from Zeus Technology revealed that 82 percent of U.S. organizations struggle to manage their website and services performance, stemming largely from growing IT complexities like virtualization, cloud computing and multi-site data center management. The survey also found that 79 percent of respondents lack resources to manage web applications across complex multi-data center environments, yet 95 percent of senior IT professionals surveyed…
Added by Chris MacKinnon on March 8, 2011 at 14:57 — No Comments
If you have a data center, you can do two things that are related to the cloud. First, you can connect your existing infrastructure in the data center to public cloud storage services; for example, back up your SQL Server to Amazon S3 or Rackspace Cloud Files. If you happen to share a data center with a public cloud storage provider, such as INTERNAP, you can connect to that provider too and enjoy the fast pipe within the same data center. Second, you can construct your own…
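A minimal sketch of the first option, pushing a SQL Server .bak file to S3 with boto3. The bucket name and key prefix here are made up, and credentials are assumed to come from boto3's usual lookup chain (environment, config file, or instance role).

```python
def backup_key(path, prefix="sqlserver/"):
    """Derive an S3 object key from a local backup file path
    (handles both Unix and Windows separators)."""
    return prefix + path.replace("\\", "/").rsplit("/", 1)[-1]

def upload_backup(path, bucket="example-sqlserver-backups"):
    """Upload a local backup file to S3. The bucket name is hypothetical;
    boto3 resolves credentials from the environment or instance role."""
    import boto3  # deferred so backup_key works even without boto3 installed
    s3 = boto3.client("s3")
    key = backup_key(path)
    s3.upload_file(path, bucket, key)  # boto3 uses multipart for large files
    return f"s3://{bucket}/{key}"
```

The same shape applies to Rackspace Cloud Files with its own client library; only the upload call changes.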
Added by Chris MacKinnon on March 7, 2011 at 16:27 — No Comments
Cloud computing is an overloaded concept into which many different technologies and software products are squeezed. Our solution focuses on the Infrastructure-as-a-Service layer, where virtualization plays a crucial role in the evolution of traditional data centers. In short, our solution is the key to the next paradigm that will govern how tomorrow's data centers are run.
…
Added by Chris MacKinnon on March 7, 2011 at 2:15 — No Comments
Ethernet keeps evolving. A number of technology evangelists recently gathered at the Ethernet Technology Summit in Santa Clara, California, and discussed nothing but Ethernet for three days. The program covered current and future Ethernet technologies, such as 40/100GbE, FCoE, 25Gbps signaling, Terabit Ethernet, virtualization, cloud computing and Ethernet in the data center.
…
Added by Chris MacKinnon on March 7, 2011 at 2:10 — No Comments
These days, good data center design includes attention, as a corporate priority, to the efficiency of space, critical support systems and IT equipment, as well as to the monitoring and data collection of operations. This goes together with design best practices, traditionally including provisioning a scalable and secure facility envelope (e.g. raised floor area, support areas, clear height, floor load and secure access), a floor plan configuration with proper room adjacencies, hot and…
Added by Chris MacKinnon on March 4, 2011 at 15:33 — No Comments
CPU-intensive or large-memory workloads that require high performance computing exist in almost every enterprise that has an analytics-driven or data-warehousing environment. vSMP Foundation is the core ScaleMP solution for solving three fundamental problems facing high performance IT today:
- Workloads with high CPU count or large memory needs
- Simplified cluster installation and management
- Scale up cloud resources for high performance or large memory…
Added by Chris MacKinnon on March 4, 2011 at 15:32 — No Comments
One of the biggest bottlenecks IT managers face is how enterprise end users can access the applications running in the data center cost-effectively and reliably, given the ever-increasing bandwidth demands of today's media-rich applications. MPLS is a reliable but very expensive way to access enterprise data centers. The price per bit for Frame Relay and MPLS service has barely come down over the years, and today seems to be coming down 10% - 15% per…
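If we read the quoted decline as an annual rate (an assumption; the original sentence is truncated), the compounding is easy to sketch:

```python
def price_after(initial, annual_decline, years):
    """Compound an annual price decline: each year the price per bit
    keeps (1 - annual_decline) of its previous value."""
    return initial * (1 - annual_decline) ** years
```

At a 10% annual decline the price per bit takes about seven years to halve; at 15% it takes a bit over four. Either way, the decline is slow compared with the growth in bandwidth demand the post describes.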
Added by Chris MacKinnon on March 3, 2011 at 14:58 — No Comments
Here at Park Place Technologies, we’ve seen and heard it all. Having worked with over 900 customers, we’ve heard many false perceptions of third party maintenance providers. These common misperceptions could lead you to the wrong choice for your hardware support. So, here are 4 third party maintenance myths and what you really need to know.
…
Added by Chris MacKinnon on March 2, 2011 at 12:16 — No Comments
Predictive analytics has been around for a while and is applied in several forms, most notably in business intelligence (BI), where you see it in areas such as web trends and consumer data analysis. What we're talking about is predictive analytics “for IT”: understanding vast amounts of real-time data to forecast IT performance issues before they affect customers and users. Our predictive analytics software is powered by what Gartner calls…
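As a toy stand-in for what such software does (real products use far richer models), a least-squares trend line over recent samples can flag a metric that is on course to breach a threshold before it actually does:

```python
def forecast_breach(samples, threshold, horizon):
    """Fit a least-squares line to an evenly spaced metric series (e.g.
    queue depth or latency samples) and report whether the extrapolated
    value crosses `threshold` within `horizon` future steps."""
    n = len(samples)
    mean_x = (n - 1) / 2                  # x values are 0..n-1
    mean_y = sum(samples) / n
    var = sum((x - mean_x) ** 2 for x in range(n))
    cov = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(samples))
    slope = cov / var
    intercept = mean_y - slope * mean_x
    projected = intercept + slope * (n - 1 + horizon)
    return projected >= threshold
```

A steadily rising series trips the forecast well before the raw metric reaches the threshold, which is the whole point of forecasting rather than alerting after the fact.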
Added by Chris MacKinnon on March 2, 2011 at 12:15 — No Comments
Many of today’s enterprise data centers employ virtualization technology to use computing resources more efficiently. Virtualization, however, introduces new challenges: in a virtual environment, all resources must be shared. This creates a complex resource allocation problem that, if left unsolved, causes performance bottlenecks, poor ROI and unhappy customers, and only worsens with scale.
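A toy version of that allocation problem is bin packing: place VMs with known resource demands onto the fewest hosts of fixed capacity. First-fit-decreasing is the classic greedy heuristic; the names and numbers below are illustrative, and demands are assumed to fit on a single host.

```python
def place_vms(vm_demands, host_capacity):
    """First-fit-decreasing placement: consider VMs from largest demand to
    smallest, put each on the first host with room, and open a new host
    when none fits. Returns ({vm: host_index}, hosts_used)."""
    free = []          # remaining capacity per host
    placement = {}
    for vm, demand in sorted(vm_demands.items(), key=lambda kv: -kv[1]):
        for i, room in enumerate(free):
            if demand <= room:
                free[i] -= demand
                placement[vm] = i
                break
        else:                              # no existing host had room
            free.append(host_capacity - demand)
            placement[vm] = len(free) - 1
    return placement, len(free)
```

Real schedulers juggle several dimensions at once (CPU, memory, I/O) plus affinity and failure-domain rules, which is why this problem gets hard at scale.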
…
Added by Chris MacKinnon on March 2, 2011 at 12:14 — No Comments
The Data Center Professionals Network
Connecting data center industry professionals worldwide. Free membership for eligible professionals.
© 2025 Created by DCPNet Admin.