Friday, 29 May 2009

How good is wind?

The Scottish government has officially switched on the largest onshore wind farm in Europe. The site on Eaglesham Moor, which takes up an area the size of central Glasgow, has 140 turbines that will power 250,000 homes. The owners, ScottishPower Renewables, have been given permission to increase the site by a further 25% to 176 turbines, which would be able to produce 452MW of power. Discussions are also under way to build an 1800MW offshore wind farm in the west of Scotland. To put this into perspective, 452MW is about 1% of the UK’s electricity consumption, and 1800MW is about 4.5%.
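As a rough sanity check on those percentages, the figures can be worked back to an implied average UK demand. This is a sketch, not from the article: the 40 GW average demand figure is my assumption (a commonly quoted late-2000s value).

```python
# Back-of-envelope check of the percentages quoted above.
# ASSUMPTION: average UK electricity demand of ~40,000 MW (40 GW);
# this figure is not from the original article.
UK_AVERAGE_DEMAND_MW = 40_000

for name, capacity_mw in [("Whitelee expanded", 452),
                          ("Proposed offshore", 1800)]:
    share = 100 * capacity_mw / UK_AVERAGE_DEMAND_MW
    print(f"{name}: {capacity_mw} MW is about {share:.1f}% of average demand")
```

Note that this compares nameplate capacity with average demand; actual wind output is lower, since turbines rarely run at full capacity.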

Data centres in the UK have been said to use around 4% of all power consumed in the country. If the offshore wind farm in Scotland gets the go-ahead, these two sites combined would produce enough power for every data centre in the UK. The Whitelee wind farm covers an area of 55 square kilometres, and the UK would require around 600 of these to meet its 2050 renewable energy targets, or roughly one and a half times the area of Wales.

The question on many people’s lips is: will energy prices fall as a result of this renewable energy? In America, many IT companies base their data centres around the Colorado River, where hydroelectric power is generated in huge volumes and sold back to the public at a relatively low rate. Yet electricity prices in Colorado rose 63% between 2002 and 2008. This new energy may be clean, but who can be sure how it will affect the price we pay in the UK?

Migration Solutions is a specialist data centre consultancy that recently won Information Age's Data Centre Innovation award for ERA, its environmental report and assessment. ERA takes a 360-degree look at your data centre and suggests areas for improvement that can help the environment while saving you money. For more information visit

Thursday, 21 May 2009

Do they delete your data?

Cambridge University has recently completed a PhD study into Internet security, focusing on what happens to your data once it is stored by the numerous social networking sites and Web 2.0 websites.

The study, which focused on the process of information removal after you delete an item or account, found that three of the biggest social networking sites (Myspace, Facebook and Bebo) retained the information after it was deleted. It is possible to gain access to this information through direct links to pages which have been cached. The big photo-sharing websites like Flickr and Photobucket are careful to close this loophole, ensuring that all your information is deleted when you ask it to be.

The implication for data centres is that all that extra information has to be stored somewhere. For example, Facebook receives on average 250,000 new registrations a day. If we presume that 5% of that number delete their accounts every day, then 12,500 people's personal information and photos are still held. As Cambridge University showed, the information must still be stored on a Facebook server somewhere, and if the average account takes up 15 MB of server space, that is 183 GB a day of information that should have been erased. 65.2 TB a year is a lot of data and a significant amount of storage space. On a modern server and storage system drawing 1.44 kW, holding it would use over 12 MWh of energy a year, and the figure would increase by the same amount each year.
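The back-of-envelope arithmetic above can be reproduced directly. The registration rate, deletion rate, account size and server power draw are the article's assumed figures, not measurements:

```python
# Reproducing the paragraph's back-of-envelope numbers.
# All inputs are the article's assumptions, not measured figures.
new_accounts_per_day = 250_000
deletion_rate = 0.05          # 5% of new accounts deleted each day
account_size_mb = 15          # assumed average account size
server_power_kw = 1.44        # assumed server + storage power draw

deleted_per_day = new_accounts_per_day * deletion_rate      # 12,500 accounts
gb_per_day = deleted_per_day * account_size_mb / 1024       # ~183 GB/day
tb_per_year = gb_per_day * 365 / 1024                       # ~65.2 TB/year
mwh_per_year = server_power_kw * 24 * 365 / 1000            # ~12.6 MWh/year

print(f"{gb_per_day:.0f} GB/day, {tb_per_year:.1f} TB/year, "
      f"{mwh_per_year:.1f} MWh/year")
```

Running a 1.44 kW system around the clock works out at roughly 12.6 MWh a year, which matches the figure quoted above (using binary units, i.e. 1 GB = 1024 MB).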

This does not even take into account the security risks to personal information, but it certainly poses a big question: do you know who has your personal data? And, more importantly, do they actually delete it when requested? Apparently not, and the only ones that suffer are you and the environment.

Migration Solutions are data centre specialists. They recently won the Information Age Data Centre Innovation Award for ERA, an environmental report and assessment. ERA investigates all the elements that affect a data centre's operation and gives advice on how to be more environmentally friendly, more efficient and save on energy consumption. For more information visit

Wednesday, 13 May 2009

Uptime Institute to change Tiering

The Uptime Institute plans to open up its data center availability tier standards, with two programs tailored toward end users and design engineers.

The Uptime tiers have become the de facto standard for availability in the data center industry. The system includes four tiers that escalate in availability as the number increases, with Tier 4 being completely fault tolerant. Uptime has tried to rein in the standards, as many data centers have claimed a certain tier availability without official certification from Uptime. On the other side, some have questioned the relevancy of the tier standards, saying that putting them to practical use can be as difficult as solving the Da Vinci Code.

(Excerpt taken from

The real problems with the Uptime Institute's tier classifications are, in my opinion, threefold.

Firstly, the standards are written by Americans for the US, and many of their requirements do not apply to the rest of the world. They are written in imperial measures, which the vast majority of other countries do not use, and they are based on 110 volt power, which is not available elsewhere. The standards clearly state that in order to comply with any tier, you must comply with ALL requirements of that tier - i.e. if you want to be Tier 2, you must comply with every requirement of Tier 2. The problem here is that outside of the US, it is impossible to comply with all of the requirements for any tier.

Secondly, the standards are very, very widely misused. Claims are made of being 'Tier 3' or 'Tier 4', but what these data centres really mean is that they have some vague relation to the levels of resilience and redundancy required by a particular 'tier'. Very few people have actually read the standards, and when challenged will only say that they are 'in line with industry peers'. Because the Uptime Institute is a US-based organisation that will not licence others to certify data centres, it is very difficult for any data centre outside of the US to obtain a tier classification; the Institute either does not have the resources to do the certification itself, or does not have the appetite. The Uptime Institute says that it has only certified 'about a dozen' data centres, which tells us something about those who claim to be 'tier' anything!

Thirdly, whilst the Uptime Institute may be a 'not for profit' organisation, the fact is that it is a company. This same company has a subsidiary called 'Uptime Technologies', which is very much a 'for profit' organisation, selling products into the data centre industry. This calls into question the independence of the Uptime Institute, as it must surely be tempted to introduce requirements that its subsidiary just happens to have products to fulfil.

The advocacy of standards is a good thing. However, standards should be defined by an international standards body that is truly independent and has no commercial subsidiary. They should account for differences between geographical locations and what can be achieved in each, and they should keep sight of the ultimate goal: a standard that defines a level of availability covering the infrastructure, the building and all the other areas currently addressed (in a simpler fashion), but also the architecture of the technology itself. It will be interesting to see how this one develops!

Migration Solutions is a vendor-independent computer room and data centre specialist. Take a look at our website at

Friday, 8 May 2009

Virgin Media Trials 200Mbps Service

Virgin Media is about to trial 200Mbps broadband. Great for consumers, but it will have a massive effect on data storage, and not just because of the new services that will be on offer. HDTV, 3D TV and other bandwidth-hungry services are coming; the BBC’s recently launched HD iPlayer service will be readily available, and services such as YouTube will only increase the amount of HD content.

All this data needs to be stored somewhere, and more commonly content is being stored locally (by the service provider) to improve download speeds. In other words, these very large files are being duplicated left, right and centre.

These massive amounts of data all have to be stored somewhere – increasing the demand for more and more data centre space.

It’s not just what’s available that we need to think about. More bandwidth will enable consumers to upload increasing amounts of content – videos, PC backups and so on – at lightning speed. No one will think twice about doing it, and in our experience people are very good at creating their DVD masterpieces and their weekly backups, but not so good at deleting data. So the amount of data to be stored will increase exponentially. Purging policies exist to some degree in most companies, but they don’t go nearly far enough – what about all those funny emails that never get deleted and are copied to everyone we can think of? Training the public at large to purge their data is going to be an uphill task.

This is a serious issue. The data centre ‘industry’ is trying to reduce its environmental impact, but with companies profiting from the data individuals store and the ever-larger files they view, there is no incentive for them to do so.

The time is fast approaching when more data will be stored for ‘leisure activities’ than for business. This deserves investigation, as the balance is most likely swinging more and more towards home users – who are incredibly difficult to control.

Migration Solutions is a specialist computer room and data centre consultancy offering independent advice on the design, build and operation of data centres and computer rooms. Visit our website at

Thursday, 7 May 2009

Swedish Carriers

The Swedish Ombudsman has ruled that local mobile carriers must advertise actual attainable internet speeds rather than the traditional "up to" figures. The ruling came after it was decided that advertising a speed that is very rarely attainable is misleading to the public; quoting realistic figures makes choosing a broadband supplier an easier job. When you order a 10Mbps connection, that is what you expect to get, but many people end up receiving under 20% of it, depending on their distance from the distribution point.

In the UK, Channel 5's "Gadget Show" has run a campaign for 18 months urging British users to complain to their suppliers if they are getting significantly less than the "up to" figure, believing the practice misleads the public. When you buy a car advertised as doing 155mph, you don't expect its top speed to turn out to be 31mph!

With a major ruling in a European country, we may expect to see similar laws applied throughout the EU, which would benefit home users and small businesses alike. A slow connection to your server in a co-location facility renders it near useless when you need to access your equipment quickly. Paying more for extra bandwidth at the co-lo site cannot help until you have a guaranteed speed to your PC or work network.

Migration Solutions is a specialist computer room and data centre consultancy offering independent advice on the design, build and operation of data centres and computer rooms. See our website at

Tuesday, 5 May 2009

Choosing a Co-Lo

Does your co-lo match your needs? Do you know what your needs are? Do you know what is best for your data?

When choosing a facility to host your IT equipment, be it one 1U server or 30 racks of high density storage, it is crucial to choose a data centre that matches your requirements. There are so many companies out there offering services that appear very similar, but are very different once you scratch the surface.

Do you need physical access to your server? So many people host their servers in or around central London, paying a premium to be close to their server(s). Could you host your server in Iceland, for example, and take advantage of cheaper power and cooling costs – or would latency be an issue? If you went down this route, would you be confident in a service you have never seen, and would you do the research to find the hosting that most closely matches your requirements?

When considering a facility's location, do you think about its security implications? Major towns and cities carry increased risks, from accidental outages caused by unrelated infrastructure maintenance to potential terrorist attacks (not just on the facility but nearby). From a power point of view, brown-outs and power cuts are becoming a greater problem in major cities as the existing supply is stretched by our increasing demands.

How redundant are the co-location facilities? Do they have multiple data (telecoms) and power feeds, on-site generators, N+1, N+2, N+N equipment duplication? If they claim that they have redundancy in their water-cooled air conditioning, does this extend to the pipes, water pumps and water tanks?

The question that you need to ask is "What am I paying for?". When you want 5kW of equipment in one rack, do they charge you for three racks because they only provision 2kW of power per rack? If they charge you for three, is your equipment spread between three racks or kept in one, and is there a cooling implication linked with this?
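The billing arithmetic in that scenario is simple but easy to overlook when comparing quotes. A minimal sketch, using the hypothetical 5kW load and 2kW-per-rack limit from the example above (not any particular provider's terms):

```python
import math

# Hypothetical figures from the example above: a 5 kW equipment load
# in a facility that provisions only 2 kW of power per rack.
required_kw = 5
kw_per_rack = 2

# The provider bills enough racks to cover the power draw, even if
# the equipment would physically fit in a single rack.
racks_billed = math.ceil(required_kw / kw_per_rack)
print(f"{required_kw} kW at {kw_per_rack} kW/rack -> billed for {racks_billed} racks")
```

So a "one rack" requirement becomes a three-rack charge purely because of the power limit, which is why the per-rack power provision matters as much as the per-rack price.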

There are so many questions to ask when hosting your equipment, and it can be a daunting process whether you know data centres inside out or have no experience of them at all. Migration Solutions are data centre consultants who specialise in performing due diligence on behalf of their clients to ensure they have the best-fitting, best-priced solution for their specific needs. Extensive experience in data centre design and operation allows Migration Solutions to help you get exactly what you want and need first time, minimising the stress of the selection process. For more information, or to find out how they can help you, visit or call now on 0845 251 2255