Tuesday, 28 July 2009

Broadband Bashing

ISPs (Internet Service Providers) are in trouble again after the release of Ofcom's latest report. There has been extended debate about the legality and honesty of the 'up to' speed clause in ISPs' bandwidth packages, because the majority of customers do not and cannot attain the speeds suggested.

The Ofcom report focused on the 8Mb service provided by ISPs across the country. Its findings revealed that only 9% of users who subscribe to an 8Mb service could attain above 6Mb, and that the majority of users got less than half the quoted amount. The ISPs argue that customers in close proximity to the telephone exchange attain close to the quoted bandwidth, and that customers further out get a slower service because of distance and the drain on the service by other users. Ofcom also revealed that the majority of 8Mb lines could not even support 8Mb in theory, having an actual capacity of only 7.1Mb.

While the rest of the country has to abide by advertising laws and provide the customer exactly what they offer, ISPs are able to suggest that you will receive a service that they cannot provide. The two words 'up to' are an unfair loophole, considering that the majority of customers are not getting the advertised amount. Ofcom is attempting to ensure that the public can get broadband connections that are consistent and that can be regulated. If companies are willing to install the infrastructure to support fast internet speeds such as ADSL Max or fibre, then they should be allowed to advertise its availability - but not before the service is available.

So what does this report hope to achieve? The government wants ISPs to be honest about the service they provide, so that it can reach its target of a minimum 2Mb internet speed for the whole country. If the report has the desired effect, ISPs will have to improve their infrastructure and their service, allowing the public to receive what they pay for rather than a substandard service.

The result of increasing internet speeds across the country will be a larger demand for Data Centre space, as people access websites, store information and extend their dependency on online services. The competition will increase speeds and drive down prices, which will make Data Centre space more affordable and desirable. Migration Solutions are data centre specialists who focus on the operation, migration and design of new and existing data centres. They are vendor independent, which allows the best products to be selected for their clients. To find out what Migration Solutions can do for you, visit www.migrationsolutions.com

Friday, 24 July 2009

Is a wireless Data Centre possible?

Does your Data Centre look like this?

The majority of Data Centres and Computer Rooms in the UK have disorganised cabling and patching which inhibits air flow and reduces efficiency. The organic nature of a network infrastructure often starts with the best intentions and neat cabling, but develops into a sprawling mess when 'quick patches' are made and then left in position. Five years down the line and you can't see the rear of your servers!

There are so many benefits to a neatly patched facility. Connecting your servers through a patch panel allows quick, organised alterations to the network while maintaining sub-floor cabling and air flows through the Data Centre. Using cable management arms inside cabinets keeps all cables neatly together and easily traceable, and allows hot air to exit the servers quickly and unobstructed, which means the computer fans do not have to work as hard, reducing power usage. But could we do away with cabling altogether?

Companies like WiTricity are already researching safe, wireless power transfer, with the idea that in the future you will be able to walk into a room or your office and your phones, laptops and MP3 players will start charging automatically without being connected to anything. The potential for this technology is unlimited, giving true freedom for you and your gadgets. But what about Data Centres and Computer Rooms? If the same technology were applied, power cables could be removed, which would improve air flows and therefore cooling efficiency. The question remains, though: could the amount of power required be transferred to the equipment safely and reliably? More importantly, with the technology currently only around 40% efficient, vast improvements will have to be made before it is a viable solution in a Data Centre environment.

So if wireless power is a possibility, what about wireless networking? Cisco are currently operating and improving a wireless Data Centre. Their facility uses Power over Ethernet and Wi-Fi to create a network-cable-free Data Centre, with up to 27 routers per metre in the ceiling to give excellent redundancy and bandwidth levels. This technology, along with wireless monitoring and management like the service provided by Synapsense, would create a true wireless Data Centre. Synapsense gives an on-demand picture of temperatures, humidities and pressures within a facility, which could replace an expensive and often inaccurate CFD (Computational Fluid Dynamics) test.

While CFD has its place, it lacks the ability to map each individual facility's server utilisation. This new method would allow a Data Centre Manager to fine-tune a facility so that servers are always in the most efficient area. With the ability to move servers and racks around freely to optimise cooling, with the only constraint being floor grille placement, super-efficient data centres could be just around the corner, and they could, just possibly, be wireless!

Migration Solutions is a member of The Green Grid, which is focused on advancing efficiency in Data Centres. They have also recently received Information Age's award for Best Data Centre Innovation for ERA - an Environmental Report and Audit which aims to help data centre owners save money and the environment with little or no financial outlay.

Swine Flu

The world has gone swine flu crazy! The public has become paranoid about contracting swine flu, and doctors have been inundated with appointments from people who think they have the symptoms. The NHS has just launched a telephone and online service to handle the huge number of enquiries and to provide Tamiflu without the need for a visit to the doctor's surgery, where the infection has the potential to spread rapidly. Record numbers of people have been phoning the service and accessing the website, www.pandemicflu.direct.gov.uk.

In the first hour they received 10,000 phone calls and 9.3 million website hits!

NHS Direct normally uses around 110-140 GB of bandwidth a month from people looking for information on its website. When the dedicated swine flu website came online, 1,404 GB was used in the first 3 hours! The new website does use a less bandwidth-heavy design, but even taking that into consideration, bandwidth use rose by 2,453%! If you compare the normal bandwidth figure (110-140 GB a month for 2006-2007) with that of 2003-2004 (25 GB), you can see how much our reliance on the internet to find information and diagnose problems has increased.
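The scale of that jump can be sketched in a few lines of Python. The figures come from the numbers quoted above; the exact basis of the 2,453% calculation isn't stated, so this only illustrates the order of magnitude, not the precise sum.

```python
# Rough sketch of the NHS Direct bandwidth comparison quoted above.
NORMAL_MONTHLY_GB = (110 + 140) / 2   # typical NHS Direct month
HOURS_PER_MONTH = 30 * 24

# What a normal 3-hour window looks like, pro-rated from a month
normal_3h_gb = NORMAL_MONTHLY_GB / HOURS_PER_MONTH * 3
swine_flu_3h_gb = 1404                # first 3 hours of the new website

multiple = swine_flu_3h_gb / normal_3h_gb
print(f"Normal 3-hour usage: ~{normal_3h_gb:.2f} GB")
print(f"Launch-day 3-hour usage: {swine_flu_3h_gb} GB ({multiple:,.0f}x normal)")
```

However the percentage is calculated, the launch traffic was thousands of times the usual rate for a window that size.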

Is the internet becoming a core utility? The majority of the country uses the internet daily for everything from finding directions to talking to relatives abroad. We rely on it as our library, dictionary and news source. When did 'Google it' become an acceptable business term!? Lord Carter's Digital Britain report has promoted the need to give everyone in the UK access to broadband internet by 2015. When you move house, ensuring that you have internet connectivity has become as essential as organising your electricity, water and gas suppliers. If the internet is becoming a core utility, should it be treated as such? Should each house have a bandwidth meter to measure 'how much' internet you use? That way, you could pay for what you use and not what you think you will use; better value for money for the consumer.
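A 'bandwidth meter' tariff could work much like an electricity meter. A hypothetical sketch, with the per-gigabyte rate and flat fee invented purely for illustration:

```python
# Hypothetical metered-broadband billing, analogous to an electricity meter.
# Both tariff figures below are invented for illustration.
FLAT_MONTHLY_FEE = 20.00   # GBP/month 'unlimited' package (assumed)
METERED_RATE = 0.15        # GBP per GB (assumed)

def monthly_bill(gb_used):
    """Metered bill: pay only for what you actually use."""
    return gb_used * METERED_RATE

for usage in (10, 50, 133):   # light, typical and heavy households
    metered = monthly_bill(usage)
    better = "metered" if metered < FLAT_MONTHLY_FEE else "flat rate"
    print(f"{usage:>3} GB: metered GBP {metered:.2f} vs flat GBP {FLAT_MONTHLY_FEE:.2f} -> {better}")
```

Light users win under metering; only heavy users would pay more than today's flat fee, which is exactly the 'pay for what you use' argument above.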

Using the same methodology in Data Centres would reduce the amount of website downtime that is caused by companies exceeding designated bandwidth limits. Companies wouldn't have to second-guess the amount of bandwidth they might use, which would make Data Centres more competitively priced. There would be a larger focus on the cost of power and subsequently the efficiency of each Data Centre. With more efficient Data Centres, less power would be used and less CO2 released, so the Data Centre operator, the customer and the environment would all benefit!

Migration Solutions created an Environmental Report and Assessment to look at how a Data Centre operates and to identify areas that could be improved within the facility and its operations. The ERA report comes with an entire section on changes that can help your Data Centre run more efficiently and effectively. These improvements range from free, low man-time changes that can yield a 5-10% reduction in power costs, to major works that include changing the facility layout, rack layout and replacing old plant equipment. For more information on the Environmental Report and Assessment visit www.migrationsolutions.com.

Wednesday, 22 July 2009

Power to the People

Have you got power? Do you know where your power comes from? How long can you survive without power? Many companies in Kent are asking these questions as they face a third day without electricity. Up to 100,000 homes and businesses have been left without power after a suspected vandal attack on an electricity substation in Dartford. Many businesses have had to close as they cannot function without power and only facilities with backup power generation have been able to continue their operations as usual.

For those who need to keep their businesses going, there has been huge demand for generators and fuel. EDF, who own the substation, have been trying to restore power and have been providing backup power in rotated 3-hour sessions where possible. A number of Data Centres in the area that do not have generators on site have been paying higher than usual prices to source power at short notice. Many are starting to realise how fragile the national grid is in the UK, and how present and expected future power problems make standby generators increasingly necessary.

So do you know where your Data Centre gets its power? Do you have diverse feeds to maintain resilience? In a worst-case scenario, how long could your facility survive without mains power and without generator fuel? The majority of Data Centres with standby generators keep no more than 24 hours' worth of fuel on site. According to the Uptime Institute's tiering standards, a Tier 4 facility is only required to hold 4 days of fuel, and a Tier 3 just 3. What happens if fuel trucks cannot get to your facility to refill the tanks? Some Data Centres claim to hold enough fuel for 90 days of normal operation. This may seem very expensive and sound like overkill, but talk of London running out of power in the near future, and past fuel price strikes, are leading business continuity experts to recommend larger fuel tanks to minimise the potential impact on critical business systems. How vital are your systems, and how long could your facility survive?
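The 'how long could we run?' question is a simple back-of-envelope calculation. The tank size, generator load and diesel consumption figures below are assumptions for illustration; substitute your own facility's numbers:

```python
# Back-of-envelope generator runtime estimate. All figures are assumptions.
TANK_LITRES = 20_000       # on-site diesel storage (assumed)
GENSET_LOAD_KW = 800       # average electrical load on the generator (assumed)
LITRES_PER_KWH = 0.30      # rough diesel burn for a large genset (assumed)

burn_rate_lph = GENSET_LOAD_KW * LITRES_PER_KWH   # litres per hour
runtime_hours = TANK_LITRES / burn_rate_lph

print(f"Burn rate: {burn_rate_lph:.0f} L/h")
print(f"Runtime on a full tank: {runtime_hours:.1f} hours (~{runtime_hours / 24:.1f} days)")
```

On these assumed figures a 20,000-litre tank buys around three and a half days, which shows why 90 days of fuel means a very large tank indeed.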

Migration Solutions are a specialist Data Centre consultancy focusing on design, build, operation and migration of facilities throughout the UK and Europe. Contact us now for advice on how to make your Data Centre or Computer Room more resilient for the future at www.migrationsolutions.com

Tuesday, 21 July 2009

Will BitTorrent ever beat the Data Centre?

With The Pirate Bay and Kazaa deciding to go legit, the way we access music, films and software could be about to change. Peer-to-peer (P2P) file transfer, legal or not, has boomed since Napster's technological breakthrough in 1999. The introduction in 2001 of BitTorrent, a file-sharing protocol that allows millions of computers around the world to share files quickly, has made sharing files easy and fast. The protocol works by splitting the original file into many small chunks. When there are multiple users downloading (or providing) a file, BitTorrent lets people fetch these little pieces from different users, which minimises the bandwidth that the original 'seeder' has to use to share the file. The protocol ensures that the rarest piece of the file is provided first, to maintain high availability. As a result, large files can be downloaded very quickly - significantly faster, with multiple downloaders, than could be achieved from a normal website.
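The 'rarest first' idea can be sketched in a few lines. This is a toy model, not a real client: real clients learn piece availability from peers' bitfields over the wire, whereas here the swarm is just a list of sets of piece indices each peer holds.

```python
# Toy sketch of BitTorrent-style 'rarest first' piece selection.
from collections import Counter

def rarest_first(swarm, have):
    """Return a piece we still need that is rarest across the swarm.

    swarm: list of sets, one per peer, of piece indices that peer holds.
    have:  set of piece indices we already have.
    """
    availability = Counter(p for peer in swarm for p in peer)
    needed = [p for p in availability if p not in have]
    if not needed:
        return None   # nothing left that the swarm can give us
    return min(needed, key=lambda p: availability[p])

# Three peers and the pieces they hold; we already have piece 0.
swarm = [{0, 1, 2}, {0, 1}, {0, 3}]
print(rarest_first(swarm, have={0}))   # picks one of the rarest pieces (2 or 3)
```

Fetching the rarest piece first keeps every piece circulating, so the file stays downloadable even as peers come and go.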

The popularity of this method of data transfer has exploded, and although there are conflicting reports of the actual amount, around 30-50% of all internet traffic is now thought to be BitTorrent uploads and downloads. IsoHunt has claimed that 1.7 petabytes (a petabyte is 1,000,000 gigabytes) are being shared by their sources. Although BitTorrent downloading is popular, it cannot guarantee a connection to the file you require.

When you download a song from iTunes, you are connecting directly to their server in a Data Centre. You have a guaranteed connection speed from the Data Centre via a fibre optic cable. The only limitations to the speed at which you receive that file are your Internet Service Provider's (ISP) network and your own internet connection. Theoretically, you could connect to the Data Centre at 14 terabits per second via fibre, which would effectively produce instant downloads. The reality is that fibre transmitters limit the practical bandwidth of fibre, and PCs cannot connect directly to a public fibre connection. Additionally, the final connection to a computer is usually made over copper cable, as it is a vastly cheaper method of data transfer in a localised area, and this limits bandwidth further.

When you download a file from a P2P or BitTorrent network, you are in most cases connecting to individual PCs. You are limited not only by your own computer, bandwidth and ISP, but by those of the people you are connecting to. BitTorrent is considerably more efficient than traditional P2P because it downloads many small pieces from many different users, which increases availability. The limit to this technology appears when connecting to another user. If you could guarantee the connection speed there would be less of an issue, but P2P networks rely on other users having the file you want, and having their computers switched on so you can receive it. In a Data Centre environment, a file would always be available.

So will BitTorrent ever replace the Data Centre? The connection method and the process that allows a BitTorrent file to be downloaded require many people to have the same file and an active internet connection. There is no guarantee that you can download a file, and every download that fails would lose someone money. While BitTorrent may be great for free (and illegal) file downloading, the lack of control would make it hard to charge for downloads. Aside from this, the majority of Data Centres house servers for their raw computing power and storage capabilities. BitTorrent software cannot recreate this, and even if it could, the running costs would be significantly higher and synchronisation impossible to maintain.

Migration Solutions is a specialist computer room and data centre company offering independent advice on the design, build and operation of data centres and computer rooms. See our website at www.migrationsolutions.com

Apollo 11 Guidance Computer - Has software moved on?

21st July 1969 was a big day for computers! Apollo 11 made history when the first man landed on the moon. Migration Solutions MD, Alex Rabbetts, has been taking a look back at the technology in use at the time and how it has evolved today.

The Apollo Guidance Computer (Apollo 11 actually had two - one in the command module and one in the lunar module) was very advanced for its time. It had 2KB of memory, 32KB of storage and a processing speed of 1.024MHz, and it managed to guide Apollo 11 all the way to the moon and back.

Let's put this in context - it had 2KB of memory ... the machine used to write this article has 2.75GB of memory. That's 1,441,792 times more memory! It had 32KB of storage ... the machine used to write this article has 139GB. That's 4,554,752 times more storage! It had a processing speed of 1.024MHz ... the machine used to write this article has 3.4GHz of processing speed. That's 3,320 times faster!

So, what can be concluded from this is that a machine with over a million times less memory, over 4.5 million times less storage and a processing speed over 3,300 times slower than the PC in my office managed to fly man to the moon and back! But then my PC does run Windows ...

A quick look at the Microsoft support website finds the minimum requirements for running Windows XP:

At least 64MB of memory (128MB is recommended) - 32,768 times more memory than the Apollo Guidance Computer
At least 1.5GB of storage - 49,152 times more storage than the Apollo Guidance Computer
A 233MHz processor (300MHz is recommended) - roughly 228 times faster than the Apollo Guidance Computer
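The comparisons above are easy to verify, using binary units (1KB = 1,024 bytes) throughout:

```python
# Checking the Apollo Guidance Computer comparisons above.
KB, MB, GB = 1024, 1024**2, 1024**3

agc_memory = 2 * KB       # 2KB of memory
agc_storage = 32 * KB     # 32KB of storage
agc_clock_hz = 1.024e6    # 1.024MHz

print(f"Memory:  {2.75 * GB / agc_memory:,.0f}x")   # vs a 2.75GB PC
print(f"Storage: {139 * GB / agc_storage:,.0f}x")   # vs a 139GB disk
print(f"Clock:   {3.4e9 / agc_clock_hz:,.0f}x")     # vs a 3.4GHz CPU
print(f"XP RAM:  {64 * MB / agc_memory:,.0f}x")     # XP minimum memory
print(f"XP disk: {1.5 * GB / agc_storage:,.0f}x")   # XP minimum storage
print(f"XP CPU:  {233e6 / agc_clock_hz:,.0f}x")     # XP minimum processor
```

The arithmetic confirms the ratios quoted in the text: roughly 1.4 million, 4.5 million and 3,320 for the PC, and 32,768, 49,152 and about 228 for the XP minimums.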

The question therefore must be ... if such a low-spec computer as the Apollo Guidance Computer was capable of flying man to the moon and back, what is my PC, with its vastly superior specification, capable of doing?

One could be forgiven for wondering exactly how this relates to data centres! The answer really is in that very large storage number. IDC estimates that by the end of 2010 the amount of data storage worldwide will reach 1 zettabyte (roughly 1.18 x 10^21 bytes) and will continue to rise at an exponential rate. The recent Digital Britain report supports this view of an exponential rise in the storage of digital media. All this data has to be stored somewhere ... much of it in data centres.

Data Centres require huge amounts of power to drive the storage equipment used for all this data. There are all sorts of drives to reduce carbon emissions from data centres, ranging from The Green Grid and the EU Code of Conduct to the Carbon Reduction Commitment. Greater efficiency in data centres is the right way to go, but we mustn't forget the software. 1.5 gigabytes just to install an operating system is a massive amount of storage compared to the Apollo Guidance Computer - the real question is whether this vast increase in the amount of code required simply to run the computer really makes it so much better than the system used to send the first man to the moon!

And one final thought ... if we were told that we were to fly to the moon today and that the computer that would get us there and back was running Windows XP or Vista, would we still want to go?

Friday, 17 July 2009

Chillerless Facilities

In the news this week is another of Google's data centres. Their facility in Belgium does not use chillers or traditional air conditioning, but utilises fresh-air cooling to cool the data centre effectively for free. Fresh-air cooling is not a new technology and is in use by many different facilities around the world, but the clever part is that Google has done away with the need for backup air conditioning units for the few days a year when temperatures rise above the safe operating levels of their computer equipment.

To allow this facility to run at optimum levels, the technology employed had to be redesigned from the ground up. Google already build their own servers, which means they can purpose-build them to their specific requirements, using minimal components and reducing the need for expensive casings and multiple drives. These servers can be run at a temperature of 27 degrees centigrade, which keeps cooling requirements minimal in the first place. The average summer temperature in Brussels is between 19-21 degrees, which allows a 6-degree margin for heat waves or unseasonably hot days. If the temperature in the data centre rises to a level the IT equipment cannot handle, Google, using its virtualisation technology, starts to transfer the servers' activity to different facilities around the world, allowing the data centre to cool naturally and the facility to return to normal operation. This process is performed automatically by the data centre, which can recognise when it may be overheating. There is also an increasing reliance on local weather forecasting so that problems can be pre-empted.
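The control logic described above amounts to a temperature-driven scheduling decision. A toy sketch of the idea follows; the threshold, site names and even the all-or-nothing shifting policy are invented for illustration, since Google's actual control system is not public:

```python
# Toy sketch of temperature-driven workload shifting between sites.
# Thresholds, site names and policy are invented for illustration.
SAFE_LIMIT_C = 27.0   # assumed safe operating temperature for the hardware

def plan_load(site_temps, load):
    """Shift all load away from sites running above the safe limit."""
    cool_sites = [s for s, t in site_temps.items() if t <= SAFE_LIMIT_C]
    if not cool_sites:
        raise RuntimeError("no site can safely take the load")
    share = load / len(cool_sites)
    return {s: (share if s in cool_sites else 0.0) for s in site_temps}

temps = {"brussels": 29.5, "dublin": 21.0, "oregon": 18.5}
print(plan_load(temps, load=100.0))   # the hot site is drained; the rest share the load
```

A real system would shift load gradually and fold in weather forecasts, as the post notes, but the core decision is this simple threshold test.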

As the world moves towards virtualisation as a more efficient method of server utilisation, the question remains: is the Google data centre blueprint the future of the modern computer room? Google have the ability and the financial backing to take a problem, dissect it and redesign it so it works most efficiently for their requirements. This is their plan for Google OS, but are Google's requirements the same as the rest of the world's?

Making your Data Centre as efficient as possible will reduce your yearly power bill significantly. Some very small, often free changes can return 5-10% savings in power. Migration Solutions are specialist consultants who created their ERA, Environmental Report and Assessment, to help Data Centre and computer room operators save running costs and extend the life of servers and their support equipment. For more information visit www.migrationsolutions.com or call 0845 251 2255

Thursday, 9 July 2009

Wind Power

The Energy Saving Trust (EST) has released a report this week about domestic wind generation which could change how and where wind turbines are installed. The report, which sampled a number of homes across the UK, used predicted and actual average wind speed readings to map the UK and establish where the most appropriate locations for wind turbine installations actually are. Although the ideal positions and geographical locations were no surprise, the findings about return on investment (ROI) and home installations were revealing.

Manufacturers' figures for domestic wind turbines suggest that, on average, building-mounted turbines achieve 10% of their rated output over a year and free-standing turbines achieve 17%. For a turbine rated at 1kW, this equates to 100 Watts and 170 Watts respectively. By comparison, large wind farms get closer to 30-40% of their rating on average, reaching 59% at peak output. The EST's findings revealed that building-mounted turbines actually averaged closer to 3% of their rating, a shortfall of 7 percentage points. This means a £4,500 1kW wind turbine would generate 262.8 kWh of power per year, which equates to £34. The ROI for a 1kW wind turbine mounted on a domestic house in an average area would therefore be 132 years!! This is not a very environmentally friendly method of power production! In an ideal location, such as Scotland, where it is windy, with few obstructions and either by the sea or on a hill, this average rises to 7.4% of the rating. That equates to 648.2 kWh for the same turbine per year, or £84, lowering the ROI to 53 years and 5 months!

On the other hand, thanks to the ideal location of free-standing turbines, they have a much higher output and a much lower ROI. Free-standing turbines need a large area of open ground so that they are safely positioned, ideally in a high position with no trees or buildings obstructing the wind flow. In the EST's tests, the average load was 19% of the turbine's rating, almost certainly because of their location. The benefit of a free-standing turbine is that it can be larger than a building-mounted one. For example, a 6kW free-standing wind turbine costs £17,200 to buy. At 19% of its rating it will generate 9,986.4 kWh a year, or £1,298, giving an ROI of 13 years and 3 months. In perfect conditions, either by the sea or on top of a hill, the output rises to 30% of the rating, which brings the ROI down to 8 years and 5 months. After this period it will start giving you free electricity.
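The turbine sums above reduce to one formula: annual output is the rating times the capacity factor times the hours in a year, and payback is cost divided by the annual saving. An electricity price of 13p/kWh is assumed here, since that matches the £ figures quoted in the post:

```python
# Wind turbine payback calculation, matching the figures quoted above.
HOURS_PER_YEAR = 8760
PRICE_PER_KWH = 0.13   # GBP; assumed from the quoted savings (262.8 kWh ~ GBP 34)

def payback_years(cost, rating_kw, capacity_factor):
    annual_kwh = rating_kw * capacity_factor * HOURS_PER_YEAR
    return cost / (annual_kwh * PRICE_PER_KWH)

# 1kW roof-mounted turbine at the EST's measured 3% capacity factor
print(f"{payback_years(4500, 1.0, 0.03):.0f} years")    # ~132 years
# 6kW free-standing turbine at 19%
print(f"{payback_years(17200, 6.0, 0.19):.1f} years")   # ~13.2 years
```

Plugging in 7.4% and 30% reproduces the 53-year and 8-year figures in the text too.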

So what has this got to do with data centres, I hear you cry? Reducing power costs in data centres is big money, and many new facilities are trying ways to make their data centres or computer rooms "green". One idea is private power generation through wind and solar, but as discussed before in this blog, solar power generation is not yet at a level where the benefits can be gained in the UK. If you have a data centre or building located in a remote area with high average wind speeds, installing 10 free-standing wind turbines could power four 3kW racks. Although the ROI would be a major factor, in a domestic situation the government will provide a grant of up to £1,000 per kW of turbine installed, and similar incentives are available for businesses. In Dubai, architects are considering a number of designs that build wind generation into new skyscrapers. These could produce 1,200,000 kWh of power a year - enough to power a 20-rack data centre, cooling included. If every new building had this technology included in the design, the possibilities of "free power" are endless.

Making your Data Centre as efficient as possible will reduce your yearly power bill significantly. Some very small, often free changes can return 5-10% savings in power. Migration Solutions are specialist consultants who created their ERA, Environmental Report and Assessment, to help data centre and computer room operators save running costs and extend the life of servers and their support equipment. For more information visit www.migrationsolutions.com or call 0845 251 2255

Wednesday, 8 July 2009

Google to re-invent the netbook OS

ChromeOS - could this be the new face of cloud computing? Google announced on their blog last night that they were developing a totally new operating system that will be specifically designed for netbooks. Chrome OS is expected to be ready for shipping towards the end of 2010 but will be available for the open source community later this year.

Google has revealed plans to include in-built security, totally redesigned to do away with processor-hungry anti-virus and malware products, and to ensure that from switch-on to internet browsing there is only a few seconds' delay rather than the few minutes needed by Windows. By starting from scratch, Google hopes to redefine the operating system into something that will "just work" rather than a tool that constantly has problems. All web-based applications will work on Chrome OS, and Google is concentrating on ensuring that all Chrome OS apps are usable on other operating systems.

This could be the future of operating systems - giving users what they want rather than what they are given. One question remains: although there are many people who disapprove of Microsoft's stranglehold on the market, will Google start to carry the torch?

Monday, 6 July 2009

Super-Data Centres

Has anyone got a spare million lying around? How about a cheeky $2 billion?

The NSA (National Security Agency) in America has announced plans to build a $2 billion, 1 million sq ft data centre in Utah, as it is fast outgrowing its Fort Meade headquarters. The proposed 65MW facility will be sited on 2 major power corridors and will house new supercomputers that the Fort Meade facility can no longer support. The NSA watches and listens to most communications - via the internet, phone calls, radio broadcasts and other methods - monitoring the whole world's traffic in an attempt to spot dangers to the USA from abroad. An element of this information is shared with GCHQ, the UK's version of the NSA. The NSA builds bespoke supercomputers to break ciphers and encryption around the world, and these require huge amounts of power and cooling to operate effectively. As an example, the world's most powerful supercomputer, the IBM Roadrunner, takes up 296 42U racks and uses 2.53MW of power, not including the water-cooled CRAC units.
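The Roadrunner figures quoted above imply a striking power density per rack, which is easy to work out:

```python
# Power density implied by the Roadrunner figures quoted above.
racks = 296
total_mw = 2.53   # compute power only, excluding the water-cooled CRAC units

kw_per_rack = total_mw * 1000 / racks
print(f"~{kw_per_rack:.1f} kW per rack")
```

At roughly 8.5kW per rack before cooling, that is several times the load of a typical commercial rack of the day, which is exactly why such machines need purpose-built power and cooling.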

In the UK we can presume that GCHQ must have a similarly sized computing capacity. There is not much information available, as you would expect from a highly classified branch of the government, but they have stated that “It’s hard for an outsider to imagine the immense size and sheer power of GCHQ’s supercomputing architecture.” This does beg the question: does the government/GCHQ have access to vast amounts of power that are not readily available to the public and the average data centre operator? There is a severe shortage of power in the UK, highlighted by companies paying vast sums of money in a bidding war to reserve the limited electricity available for the Olympics. The government has a plan to build a number of new nuclear power stations and clean coal stations (which will include carbon capture) by 2018, but many sources say that the UK will run out of power by 2015 as old power stations come to the end of their lives.

The best way to safeguard your data centre or computer room for the future is to make it more efficient now, so that you will not require excessive power in the next 10 years. Migration Solutions are specialist Data Centre consultants who design, build, operate and migrate data centres all over the UK and Europe, and have recently won Information Age's Data Centre Innovation award for ERA, an Environmental Report and Assessment which looks at ways that computer room and data centre operators can cut costs and improve their facilities' efficiency. For more information visit www.migrationsolutions.com or call 0845 251 2255.

Friday, 3 July 2009

UK Government says more data centres needed!

According to the widely publicised 'Digital Britain' report, it is anticipated that 'the volume of digital content will increase 10x to 100x over the next 3 to 5 years' and that 'we are on the verge of a "big bang" in the communications industry that will provide the UK with enormous economic and industrial opportunities.' The question posed in the report, however, is where is all this data going to be stored? The answer, of course, is in data centres.

The report continues; 'All of the information on the global Internet, whether for commerce, industry or consumer consumption, has to be stored somewhere in digital form on servers. This is the function of the Data centres. They are a crucial part of the underlying infrastructure and a vital foundation block of much of the digital economy.'

'The current demand for highly-connected data centres in the UK points to constraints in supply which is of concern as these facilities can take up to two years to build from initial inception. The private sector needs to look beyond the current recession since the up-turn in the economy will not be the only driver of expanding demand - the quantity of information to be stored continues to rise exponentially across the world.'

Those involved in the industry already know about the shortage of supply, but if the report is only half right, this shortage will soon become a drought. Putting this in context, the introduction to the report states, 'Yesterday, 20 hours of new content were posted on YouTube every minute, 494 exabytes of information were transferred seamlessly across the globe, over 2.6 billion mobile minutes were exchanged across Europe, and millions of enquiries were made using a Google algorithm.'

Assuming a rather conservative view is taken and that digital content increases just 5 times over the next 3 to 5 years, this would result in 100 hours of YouTube content being posted every minute, 2,470 exabytes of information being transferred and 13 billion mobile minutes being exchanged across Europe. This would result in the current shortage becoming chronic!
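That 'conservative' projection is just the report's snapshot multiplied by five:

```python
# The conservative five-fold projection applied to the report's snapshot.
GROWTH = 5
today = {
    "YouTube hours posted per minute": 20,
    "exabytes transferred per day": 494,
    "mobile minutes across Europe (bn)": 2.6,
}
for metric, value in today.items():
    print(f"{metric}: {value} -> {value * GROWTH:g}")
```

Even at the bottom of the report's 10x-100x range, the scaled figures match those above: 100 hours, 2,470 exabytes and 13 billion minutes.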

It is evident, not just from the report but also from what is happening today, that new data centres must be built ... and soon! The average data centre takes 6-12 months to construct (some longer), and that's only after all the relevant permissions have been obtained. Building a data centre isn't that easy either. In order to support all of the technology - servers, storage, network equipment etc. - there is a significant amount of infrastructure required. Firstly, any site identified as a potential data centre site needs power ... lots of it! And there isn't much of that around at the moment. Some towns and cities are reported to be down to their last megawatt of power, and that isn't nearly enough to power a data centre of any size. A data centre of just 100 racks of equipment is likely to need at least 1 megawatt of power; the average commercial data centre will have upwards of 500 racks ... some will run into thousands. A second consideration is communications. These data centres need big communications links - not a 200Mbps connection as is currently being trialled in Kent, but several Gigabit, or even Terabit, connections. Getting these connections to the data centre is not cheap!
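The '100 racks needs a megawatt' rule of thumb scales linearly, assuming (as the figures above imply) roughly 10kW of total facility draw per rack once cooling and power-distribution overhead are included:

```python
# Rule-of-thumb site power sizing: ~1 MW of supply per 100 racks.
KW_PER_RACK_TOTAL = 10   # IT load plus cooling/UPS overhead (assumption)

def site_power_mw(racks):
    return racks * KW_PER_RACK_TOTAL / 1000

for racks in (100, 500, 2000):
    print(f"{racks} racks -> ~{site_power_mw(racks):.0f} MW")
```

A 500-rack commercial facility therefore needs around 5MW, and a thousands-of-racks site tens of megawatts, which is why a town down to its last megawatt cannot host one.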

Getting the design right is going to be crucial. It isn't, as some newer entrants to the market seem to believe, a case of chucking a load of power and communications into a shed and calling it a data centre. Very careful consideration needs to be given to the operation of these data centres after the builder has gone. It needs specialist design with the ongoing operation at the forefront of that design. In some existing data centres an upgrade will be necessary, in others it won't be possible as their design and infrastructure simply won't support the new demands.

The report states, 'London is the largest data centre market in Europe and a location for international businesses looking to expand into Europe.' This is currently true, but the clever people are going to be thinking slightly outside the box. London is the largest data centre market in Europe but with the chronic shortage of power in London at the moment and the lack of investment in infrastructure meaning it won't get better anytime soon, London may not be the panacea to the problem. The UK certainly, but the clever money will be building outside of the capital. Some will need to be close enough to allow synchronous replication of data, but most applications don't need this - digital storage being one.

Smart operators should be looking to build data centres in locations where both power and connectivity are available. Really smart operators should be looking at sites where both power and connectivity are available and where the power comes from a 'green' source. Whilst the government, through the Digital Britain report, has rightly identified that there is a need for more data centres and that the current shortage can only get worse, they have, as ever, balanced this with the promise to heavily tax significant users of energy, of which data centres are one.

Interesting times are ahead for the data centre industry!

Migration Solutions has many years of experience in computer room and data centre design and construction. Utilising operational experience to create a data centre that runs as efficiently as possible, but is easy to use for the operations team is the most important aspect of any data centre or computer room design. For more information visit www.migrationsolutions.com