Posts Tagged ‘extreme data’

DatacenterDynamics 5th Annual Data Centre Leaders’ Awards is Now Open for Entries

Friday, September 30th, 2011

For a fifth year the Data Centre Leaders’ Awards will continue to recognise excellence in data centre design and operation, and to reward the professionals responsible.


This year the industry’s most prestigious accolades have been expanded to feature 15 different categories, and these are open to entrants from anywhere within Europe, the Middle East and Africa.

Chair of the Judging Panel, Zahl Limbuwala, said: “Data centres are central to modern life. We should be celebrating the innovation and skill of the industry. After all, it has literally remade itself over the last decade to accommodate evolving IT, environmental and cost pressures. I’m delighted to be invited to play a part in the judging process; the Datacenter Leaders Awards are always closely contested because of the high levels of intelligent thinking that goes into meeting today’s uninterrupted data centre requirements.”

The Awards are also innovating with the introduction of new categories, including an award for Data Centre Blueprints and the “Most Extreme Data Centre Deployment Award”. George Rockett, of DatacenterDynamics, said: “These days people are putting data centres everywhere! We want to recognise that and bring to the attention of the public some of the remarkable engineering feats and almost unimaginable places in which critical infrastructure is being made operational.”

The Awards will be presented at a gala evening on December 15th at the Lancaster London hotel, hosted by broadcaster and journalist Andrew Neil.


A Call for Entries as the 5th Annual Data Centre Leaders’ Awards is Announced

Sunday, September 18th, 2011

The Awards are also innovating with the introduction of new categories, including the intriguing “Most Extreme Data Centre Deployment Award”.

London, UK (PRWEB UK) 17 Sep 2011

Recognising excellence in data centre design and operation, the Data Centre Leaders’ Awards 2011 features 15 different categories embracing different sizes and types of data centre, as well as the dedicated people and teams responsible for their design and operation.

Chair of the Judging Panel, Zahl Limbuwala, said: “Data centres are central to modern life. It’s right therefore that we should gather to celebrate the innovation and skill of an industry that has remade itself to accommodate ever-evolving IT, environmental and cost pressures. Judging the Awards is so challenging because of the eye-catching levels of intelligent thinking on display as industry professionals go about ensuring that their data centres are always on for business.”

The Awards are also innovating with the introduction of new categories, including the intriguing “Most Extreme Data Centre Deployment Award”. George Rockett of DatacenterDynamics said, “These days people are putting data centres everywhere! We want to recognise that and bring to the attention of the public some of the remarkable engineering feats and almost unimaginable places in which critical infrastructure is being made operational.”

The 5th Annual Data Centre Leaders Awards look set to mark yet another year of positive and healthy competition among the well-established, the up-and-coming and the small companies within the sector. The prestigious awards ceremony will take place at the Lancaster London on Thursday December 15th 2011.

A call for entries is released for the following Award categories:

➢    Future Thinking Design Concepts

➢    Data Centre Blueprints

➢    Innovation in the Micro-Data Centre

➢    Innovation in the Medium-Data Centre

➢    Innovation in the Mega-Data Centre

➢    Leadership in the Public Sector

➢    Innovations in Outsourcing

➢    The ‘Green’ Data Centre

➢    Improved Data Centre Energy Efficiency

➢    Innovation in IT Optimisation

➢    The Most Extreme Data Centre Deployment

➢    Young Mission Critical Engineer of the Year

➢    Data Centre ‘Special Assignment’ Team of the Year

➢    Data Centre Business Leader of the Year

➢    Outstanding Contribution to the Industry

For further details of the 5th Annual Datacentre Leaders Awards 2011, together with award criteria and online entry forms, please visit

Press Contact:

Damien Wells

Spa Communications


+44 (0) 7900 302102

Entries Judging:

Nick Parfitt



+44 (0) 207 426 7835

Sponsoring Bookings:

Steve Foley



+44 (0)20 7426 4815




Microsoft’s, Google’s big data plans give IT an edge

Wednesday, August 10th, 2011

CIOs are looking to pull actionable information from ultra-large data sets, or big data. But big data can mean big bucks for many companies. Public cloud providers could make the big data dream more of a reality.

[Researchers] want to manipulate and share data in the cloud.

Roger Barga, architect and group lead, XCG’s Cloud Computing Futures (CCF) team

Big data, which is measured in terabytes or petabytes, often comprises Web server logs, product sales data, and social network and messaging data. Of the 3,000 CIOs interviewed in IBM’s Essential CIO study, 83% listed an investment in business analytics as a top priority. And according to Gartner, by 2015, companies that have adopted big data and extreme information management will begin to outperform unprepared competitors by 20% in every available financial metric.

Budgeting for big data storage and the computational resources that advanced analytic methods require, however, isn’t so easy in today’s anemic economy. Many CIOs are turning to public cloud computing providers to deliver on-demand, elastic infrastructure platforms and Software as a Service. When referring to the companies’ search engines, data center investments and cloud expertise, Steve Ballmer said, “Nobody plays in big data, really, except Microsoft and Google.”

Microsoft’s LINQ Pack, LINQ to HPC, Project “Daytona” and the upcoming Excel DataScope were designed explicitly to make big data analytics in Windows Azure accessible to researchers and everyday business analysts. Google’s Fusion Tables aren’t set up to process big data in the cloud yet, but the app is really easy to use, which likely will boost its popularity. It seems like the time is now to prepare for extreme information management in the enterprise so you can outperform your “unprepared competitors.”

LINQ to HPC marks Microsoft’s investment in big data
Microsoft has dipped a toe in the big data syndication marketplace with Windows Azure Marketplace DataMarket. However, the company’s major investments in cloud-based big data analytics are beginning to emerge as revenue-producing software and services candidates. For example, in June 2011, Microsoft’s High Performance Computing (HPC) group released Beta 2 of HPC Pack for Windows HPC Server 2008 clusters and LINQ to HPC R2 SP2.

Bing search analytics use HPC Pack and LINQ to HPC, which were called Dryad and DryadLINQ, respectively, during several years of development at Microsoft Research. LINQ to HPC is used to analyze unstructured big data stored in file sets that are defined by a Distributed Storage Catalog (DSC). By default, three DSC file replicas are installed on separate machines running HPC Server 2008 with HPC Pack R2 SP2 in multiple compute nodes. LINQ to HPC applications, or jobs, process DSC file sets. According to David Chappell, principal, Chappell Associates, the components of LINQ to HPC deliver “data-intensive computing with Windows HPC Server” combined with on-premises hardware (Figure 1).

Figure 1. In an on-premises installation, the components of LINQ to HPC are spread across the client workstation, the cluster’s head node and the compute nodes.

The LINQ to HPC client contains a .NET C# or VB project that executes LINQ queries, which the LINQ to HPC Provider then sends to the head node’s Job Scheduler. LINQ to HPC uses a directed acyclic graph data model, in which each vertex is a processing step and each edge a data dependency. The Job Scheduler then creates a Graph Manager that manipulates the graphs.
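The Graph Manager’s internals aren’t described here. Purely as an illustration of the directed-acyclic-graph execution model, the Python sketch below (the stage names are hypothetical, not LINQ to HPC’s) uses the standard library’s `graphlib` to derive an order in which every stage runs only after the stages it depends on:

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Hypothetical query stages: each key is a processing step,
# each value lists the steps it depends on (its predecessors).
stages = {
    "read_fileset": [],
    "filter": ["read_fileset"],
    "partition": ["read_fileset"],
    "aggregate": ["filter", "partition"],
    "write_output": ["aggregate"],
}

# A topological sort yields a valid schedule for the DAG:
# dependencies always appear before the stages that consume them.
order = list(TopologicalSorter(stages).static_order())
print(order)
```

A real scheduler would additionally run independent stages (here, `filter` and `partition`) concurrently, since neither depends on the other.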

One major advantage of the LINQ to HPC design is that it enables .NET developers to easily write jobs that execute in parallel across many compute nodes, a situation commonly called an “embarrassingly parallel” workload.
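As a minimal sketch of the pattern (not LINQ to HPC itself, which is .NET-based), the Python example below splits a log into independent partitions and counts errors in each concurrently. In a real cluster, each partition would be shipped to a separate compute node; worker threads stand in for nodes here:

```python
from concurrent.futures import ThreadPoolExecutor

def count_errors(lines):
    """Independent per-partition work: count 'ERROR' entries in one chunk."""
    return sum(1 for line in lines if "ERROR" in line)

def partition(data, n):
    """Split data into n round-robin partitions."""
    return [data[i::n] for i in range(n)]

def parallel_error_count(logs, n_partitions=4):
    # Each partition is processed without needing data from any other,
    # which is exactly what makes the job embarrassingly parallel.
    chunks = partition(logs, n_partitions)
    with ThreadPoolExecutor(max_workers=n_partitions) as pool:
        partials = pool.map(count_errors, chunks)
    # The only coordination needed is combining the partial results.
    return sum(partials)

logs = ["INFO ok", "ERROR disk full", "ERROR net down", "INFO ok"] * 1000
print(parallel_error_count(logs))  # 2000
```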

Microsoft recently folded the HPC business into the Server and Cloud group and increased its emphasis on running HPC in Windows Azure. Service Pack 2 allows you to run compute nodes as Windows Azure virtual machines. The most common configuration is a hybrid cloud mode called the “burst scenario”, where the head node resides in an on-premises data center and a number of compute nodes run as Windows Azure VMs, depending on the load, with file sets stored on Windows Azure drives. In LINQ to HPC, customers can perform data-intensive computing with the LINQ programming model on Windows HPC Server.

Will “Daytona” and Excel DataScope simplify development?
The eXtreme Computing Group (XCG), an organization in Microsoft Research (MSR) established to push the boundaries of computing as part of the group’s Cloud Research Engagement Initiative, released “Daytona” as a Community Technical Preview (CTP) in July 2011. The group refreshed the project later that month.

Daytona is a MapReduce runtime for Windows Azure that competes with Amazon Web Services’ Elastic MapReduce, the Apache Foundation’s Hadoop MapReduce, MapR’s Apache Hadoop distribution and Cloudera Enterprise Hadoop. A major advantage of “Daytona” is that it’s easy to deploy to Windows Azure. The CTP includes a simple deployment package with pre-built .NET MapReduce libraries and host source code, C# code and sample data for k-means clustering and outlier detection analysis, as well as complete documentation.
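The CTP’s k-means sample is C#/.NET and isn’t reproduced here. As an illustrative analogue only (hypothetical data, not Daytona’s API), the Python sketch below shows how one k-means iteration decomposes into the map, shuffle and reduce steps that a MapReduce runtime such as Daytona executes at scale:

```python
from collections import defaultdict

def nearest(point, centroids):
    """Index of the centroid closest to point (squared Euclidean distance)."""
    return min(range(len(centroids)),
               key=lambda i: sum((p - c) ** 2 for p, c in zip(point, centroids[i])))

def kmeans_iteration(points, centroids):
    # Map phase: emit (centroid_index, point) pairs independently per point.
    pairs = [(nearest(p, centroids), p) for p in points]
    # Shuffle phase: group points by the centroid they were assigned to.
    groups = defaultdict(list)
    for idx, p in pairs:
        groups[idx].append(p)
    # Reduce phase: each new centroid is the mean of its assigned points.
    new_centroids = list(centroids)
    for idx, members in groups.items():
        dims = len(members[0])
        new_centroids[idx] = tuple(sum(m[d] for m in members) / len(members)
                                   for d in range(dims))
    return new_centroids

points = [(0.0, 0.0), (0.0, 1.0), (9.0, 9.0), (10.0, 10.0)]
centroids = [(0.0, 0.0), (10.0, 10.0)]  # initial guesses
for _ in range(5):  # iterate until (quickly) converged
    centroids = kmeans_iteration(points, centroids)
print(centroids)  # [(0.0, 0.5), (9.5, 9.5)]
```

Because the map phase touches each point independently, it parallelizes across nodes; only the per-centroid reduction needs coordination.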

“Daytona has a really simple, easy-to-use programming interface for developers to write machine-learning and data-analytics algorithms,” said Roger Barga, an architect and group lead on the XCG’s Cloud Computing Futures (CCF) team. “[Developers] don’t have to know too much about distributed computing or how they’re going to spread the computation out, and they don’t need to know the specifics of Windows Azure.”

Barga said in a telephone interview that the Daytona CTP will be updated in eight-week sprints. This interval parallels the update schedule for the Windows Azure CTP during the later stages of its preview in 2010. Plans for the next Daytona CTP update include a RESTful API and performance improvements. In fall 2011, we can expect an upgrade to the MapReduce engine that will enable the addition of stream processing to traditional batch processing capabilities. Barga also said the team is considering an open-source “Daytona” release, depending on community interest in contributing to the project.

In June 2011, Microsoft Research released Excel DataScope, its newest big data analytics and visualization candidate. Excel DataScope lets users upload data, extract patterns from data stored in the cloud, identify hidden associations, discover similarities between datasets and perform time series forecasting using a familiar spreadsheet user interface called the research ribbon (Figure 2).

Figure 2. Excel DataScope’s research ribbon lets users access and share computing and storage on Windows Azure.

“Excel presents a closed worldview with access only to the resources on a user’s machine. Researchers are a class of programmers who use different models; they want to manipulate and share data in the cloud,” explained Barga.

“Excel DataScope keeps a session to Windows Azure open for uploading and downloading data to a workspace stored in an Azure blob. A workspace is a private sandbox for sharing access to data and analytics algorithms. Users can queue jobs, disconnect from Excel and come back and pick up where they left off; a progress bar tracks the status of the analyses.” Silverlight provides Excel DataScope’s data visualization feature. Barga expects the first Excel DataScope CTP will drop in fall 2011.

Google Base is a goner, but Google Fusion Tables holds promise
Google Base was the first Web-accessible, non-relational data management system based on the company’s BigTable technology. There was a flurry of interest when Google introduced the Base beta in 2005, but many early users became frustrated with its restrictive schema and poor

I first voiced my Google Base concerns after trying to use it as a general-purpose cloud data store. Google moved Base into the Merchant Center as the data store for Google Product Search in September 2010 and sounded Base’s death knell by year’s end, when it deprecated the API in favor of the new set of Google Shopping APIs.

Google Fusion Tables. The free beta version of Google Fusion Tables, which was introduced on Google Labs in 2009, lets users upload and download *.csv files with a maximum of 100 MB per dataset and 250 MB per user. Users can share files with the public or other designated users. However, these storage limits are too low to work with production Fusion Tables for big data projects; the product’s storage limits would need to be expanded by at least four orders of magnitude to get to that point.

Users can filter and aggregate the data, as well as visualize it with Google Maps or other methods offered by the Google Visualization API. Fusion Tables also enables dataset and individual item annotation; users can also join, or fuse, tables according to primary key values.
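Fusion Tables exposes these operations through its web UI rather than code. Purely as an analogue of the same filter, aggregate and fuse steps (hypothetical rows, not the Fusion Tables API), a plain-Python sketch:

```python
from collections import defaultdict

# Hypothetical rows standing in for two uploaded CSV tables.
sales = [
    {"city": "London", "units": 120},
    {"city": "Paris",  "units": 80},
    {"city": "London", "units": 40},
]
locations = [
    {"city": "London", "lat": 51.5074, "lng": -0.1278},
    {"city": "Paris",  "lat": 48.8566, "lng": 2.3522},
]

# Filter: keep only the rows with at least 100 units.
big_sales = [r for r in sales if r["units"] >= 100]

# Aggregate: total units per city.
totals = defaultdict(int)
for r in sales:
    totals[r["city"]] += r["units"]

# Fuse (join) on the shared 'city' key, like merging two Fusion Tables;
# the result carries the coordinates a map layer would plot.
geo = {r["city"]: r for r in locations}
fused = [{**r, "lat": geo[r["city"]]["lat"], "lng": geo[r["city"]]["lng"]}
         for r in sales]

print(dict(totals))  # {'London': 160, 'Paris': 80}
```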

Nobody plays in big data, really, except Microsoft and Google.

Steve Ballmer, CEO, Microsoft

According to a November 2010 Google Operating Systems post, Fusion Tables “graduated” from Google Labs in September and will be included in the Google Docs app. “Google Docs includes fusion tables in the list of document types and there’s also an icon for fusion tables. Users can already import tables from Google Spreadsheets and sharing works just like in Google Docs.” “Graduation” means Fusion Tables should escape Larry Page’s “more wood behind fewer arrows” edict of July 2011, which shut down Google Labs.

The most interesting use of Fusion Tables involves integration with the Google Maps JavaScript API to visualize geocoded data. In “Digging a little deeper into Google Fusion Tables – A technical GIS perspective,” software developer Viswaug noted:

“The ‘Map’ and ‘Intensity Map’ table data should be of special interest to all the GIS folks. It makes the process of mapping data real easy. The ‘Location’ field type in Fusion Tables supports both street address strings and KML string representation of geometries. The street addresses entered into the location field get automatically geocoded and are viewable on the map visualization.”

Figure 3 shows the geographic distribution of Mexico drug war murders since December 2006 from one of the 107 public datasets accessible from the Fusion Tables site.

Figure 3. The Google Fusion client’s Visualize menu offers a choice of 10 visualization options.

Another Fusion Tables project, 2010 U.S. Election Ratings, lets users select senate, house and governor races with projections from a range of
Roger Jennings is a data-oriented .NET developer and writer, the principal consultant of OakLeaf Systems and curator of the OakLeaf Systems blog. He’s also the author of 30+ books on the Windows Azure Platform, Microsoft operating systems (Windows NT and 2000 Server), databases (SQL Azure, SQL Server and Access), .NET data access, Web services and InfoPath 2003. His books have more than 1.25 million English copies in print and have been translated into 20+ languages.

This was first published in August 2011.
