Posts Tagged ‘group vice president’

Infrastructure alliance inflates cloud prospects

Saturday, November 5th, 2011

The Open Data Centre Alliance (ODCA) has released a slew of announcements in support of a prediction that cloud deployment numbers will triple in the next two years.

Based on predictions from its networking, storage and enterprise management members’ cloud deployments, it also said their cloud services adoption would be five times faster than broad market forecasts.

It compared the adoption rate to that of analyst IDC, which predicted in the second quarter of this year that the global market for public cloud IT services would exceed $90 billion (£56 billion) by 2015.

“A tripling of cloud services is five times faster than the most current forecasts of broad cloud services growth,” stated Matthew Eastwood, group vice president and general manager of IDC’s Enterprise Platform Group.

“The alliance has created a unique opportunity for accelerated, customer-prioritised innovation, which is reflected in this aggressive forecast of cloud adoption.”
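As a quick sanity check on that comparison, the arithmetic can be sketched in a few lines of Python. Tripling over two years implies a compound annual growth rate of roughly 73 per cent; the broad-market growth figure below is an assumed placeholder, not a number from the article, chosen only to show how a “five times faster” ratio can fall out.

    # Illustrative arithmetic only; the 15% broad-market CAGR is an assumption.
    def cagr(multiple: float, years: float) -> float:
        """Compound annual growth rate implied by growing `multiple`x over `years`."""
        return multiple ** (1 / years) - 1

    odca_growth = cagr(3.0, 2)     # tripling in two years -> ~73% a year
    broad_forecast = 0.15          # assumed broad-market forecast CAGR

    print(f"Implied member CAGR: {odca_growth:.0%}")                        # 73%
    print(f"Ratio vs broad forecast: {odca_growth / broad_forecast:.1f}x")  # 4.9x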

The ODCA also announced the addition of HP and Computer Associates (CA) to its member ranks, which no doubt helped paint the rosy outlook. The group said it could now count representation from over 90 per cent of the virtualisation software market and over two-thirds of the server hardware market among its members.

Over 40 per cent of ODCA members were expecting to run more than 40 per cent of their internal IT in cloud environments in the next two years. More than 30 per cent said human resources (HR), along with legal and financial applications, were primary targets for enterprise cloud adoption, while sales and marketing tools were cited as top targets for public cloud implementation.

Marvin Wheeler, ODCA Board of Directors chairman, said the forecasts illustrated how fast the membership was “implementing cloud in internal environments and with service providers”.

The announcements are the latest details to emerge from the ODCA as it moves towards its stated goal of encouraging over $50 billion in cloud investment over the next three years.

The ODCA also said it would align its cloud security assurance and monitoring usage models with the cloud audit specification and Security, Trust and Assurance Registry (STAR) programme of the Cloud Security Alliance (CSA) to establish a standard definition of security for cloud services in the first half of next year.

Collaboration with the Distributed Management Task Force (DMTF) will focus on enhancements to the DMTF’s Open Virtualisation Format (OVF) specification to support the ODCA’s virtual machine (VM) interoperability usage model. Results from this joint work are expected to ease cloud workload migration, regardless of VM manager or data centre location, before the second half of next year.
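For readers unfamiliar with OVF, the interoperability idea is easy to see in code: an OVF descriptor is plain XML listing the virtual systems in a package, so any compliant VM manager can read the same package. The Python sketch below uses a deliberately simplified envelope (not a complete, valid descriptor) under the OVF 1.x namespace.

    # Simplified illustration of reading an OVF descriptor; the envelope below
    # is a minimal stand-in, not a complete, valid OVF file.
    import xml.etree.ElementTree as ET

    OVF_NS = "{http://schemas.dmtf.org/ovf/envelope/1}"

    descriptor = """<Envelope xmlns="http://schemas.dmtf.org/ovf/envelope/1">
      <VirtualSystem id="web-tier">
        <Info>Example virtual machine</Info>
      </VirtualSystem>
    </Envelope>"""

    root = ET.fromstring(descriptor)
    for vs in root.iter(f"{OVF_NS}VirtualSystem"):
        print("virtual system:", vs.get("id"))   # -> web-tier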

The group also released its first industry best practice paper for cloud application development and resiliency, to tackle availability concerns in the face of recent outages. The paper addresses the top challenges identified by ODCA members: application migration, secure connection and simplified management.

Clive Longbottom, founder and business process facilitation service director at analyst Quocirca, told Cloud Pro he welcomed the move towards convergence among some of these cloud industry groups.

But he added: “As they say, the wonderful thing about (cloud) standards is that there are so many to choose from.”

He commended the ODCA for bringing together an impressive list of members, but pointed out that many also sit on a large number of cloud bodies, not all of which seem to want to work together.

“There are still too many different bodies covering different cloud technologies and approaches, which makes it difficult for end users to figure out who’s doing what,” he said.

“For example, is OpenStack a product, an approach or a group? Why does SNIA (Storage Networking Industry Association) have a cloud group within it? The DMTF, IEEE, W3C and OASIS all have different cloud groups, many working on standards.”

As to the ODCA collaborations, Longbottom concluded: “I hope that it all works out, but they will need to move fast to stop the existing fragmentation of cloud, based on proprietary platforms (such as Amazon, Azure and so on), and take a more long-term approach, where workloads can be moved across cloud providers in an open, self-service, flexible and elastic manner (which is pretty much what NIST’s definition of cloud is).”

Article source: http://www.cloudpro.co.uk/cloud-essentials/2171/infrastructure-alliance-inflates-cloud-prospects

Study shows major shift to cloud computing in manufacturing

Thursday, September 22nd, 2011

Cloud computing technology will be a main driver of manufacturing IT productivity in the coming decade, a new report from IDC Manufacturing Insights claims. And manufacturers seem to be well on their way, adopting cloud computing technology somewhat faster than other industries: 23% are already using cloud technology and another 44% are implementing it or have firm plans.

“I was a bit surprised that it was as strong as it was,” said Bob Parker, group vice president at IDC Manufacturing Insights and author of the report, titled Business Strategy: Cloud Computing in Manufacturing. “This is evidence of a market that is going to have some lift.”

The findings come from the Framingham, Mass.-based research firm’s analysis of a survey completed early this year. Among the 658 respondents, 98 identified themselves as manufacturers.

More on cloud computing in manufacturing

Read about the Microsoft cloud ERP framework

Learn about SaaS supply chain management

Understand cloud computing risks

The results suggest cloud computing (broadly defined to include everything from basic computing resources to end-user applications and services) will continue to attract a bigger share of manufacturing IT budgets, as more manufacturers said they plan to outsource their IT infrastructure in the next two years. The percentage of companies managing infrastructure internally will drop from 55% to 38%, according to the report, with traditional outsourcing, in which a third party owns the hardware and maintains staff, seeing its share of IT budgets rise from 14% to 21%. The report says other industries plan, on average, to keep budgets for traditional outsourcing flat, and predicts manufacturers will do the same “once the perception of cloud risk is abated.”

Still, cloud computing will outstrip traditional outsourcing by then, consuming around 40% of manufacturing IT budgets, nearly matching the percentage devoted to internal IT management. Private clouds are by far the preferred choice, says IDC, with externally hosted clouds slightly more popular than internally hosted ones. Public clouds will see some growth and will consume 13% of IT budgets in two years, according to IDC.

Flexibility, services mindset driving cloud computing investments
Parker theorizes that sharp economic fluctuations are pushing manufacturers toward cloud computing technology because it lets them add and remove resources instead of over-investing internally to handle spikes in demand. “Manufacturers, more than other industries, have software license fatigue,” Parker said.

But the survey also suggests Software as a Service (SaaS) is beginning to lose its luster as a cheaper option as companies gain more experience with the real costs of multiple SaaS subscriptions, Parker said. Manufacturers are starting to realize that their best long-term savings opportunities might instead be in Infrastructure as a Service (IaaS), which lets them access storage, CPU cycles and other basic resources from the cloud.
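The trade-off Parker describes can be made concrete with a toy cost model. Every number below is invented for illustration; the point is only that per-seat subscription costs accumulate differently from pooled infrastructure costs.

    # Toy comparison of cumulative SaaS seat costs vs pooled IaaS capacity.
    # All prices and counts are invented for illustration.
    def saas_spend(seats: int, per_seat_month: float, months: int) -> float:
        return seats * per_seat_month * months

    def iaas_spend(instances: int, per_instance_hour: float, months: int) -> float:
        return instances * per_instance_hour * 24 * 30 * months

    months = 36  # a three-year horizon
    print("SaaS:", saas_spend(500, 40.0, months))   # 500 seats at $40/month
    print("IaaS:", iaas_spend(20, 0.50, months))    # 20 always-on instances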

Manufacturers are slightly more likely than companies in other industries to see cloud computing technology as a way to organize IT services, the research shows.

“There’s a transformation among manufacturers to do this kind of catalog of services,” Parker said.

Manufacturers have been strongly influenced by the Information Technology Infrastructure Library (ITIL) v. 3 best practices and similar approaches, which call for building “catalogs” of services that can be replicated and shared throughout an organization.

“They’re using them to get out of project jail and be more thought of as a service provider of IT,” he said.
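A service catalog in the ITIL sense is, at bottom, a structured list of repeatable offerings with service levels attached. A minimal sketch, with hypothetical names and figures throughout:

    # Hypothetical ITIL-style service catalog entries; names and numbers are
    # illustrative, not from the report.
    from dataclasses import dataclass

    @dataclass
    class CatalogService:
        name: str
        description: str
        availability_target: float   # e.g. 0.999 = "three nines"
        monthly_cost: float          # illustrative chargeback price

    catalog = [
        CatalogService("managed-vm", "Standard virtual machine", 0.999, 120.0),
        CatalogService("mfg-analytics", "Plant-floor reporting stack", 0.995, 900.0),
    ]

    for svc in catalog:
        print(f"{svc.name}: {svc.availability_target:.1%} availability, "
              f"${svc.monthly_cost}/month")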

Manufacturing’s dramatic shift toward outsourcing comes at a time when companies are de-emphasizing traditional IT jobs and hiring more data scientists, “anthropologists” trained to analyze human behavior in social media, and others with hybrid IT and business skills to perform analytics on “big data,” Parker said.

“There’s a transformation going on in terms of the people who work in IT in manufacturing,” he said. “It has moved from building systems that drive processes to analyzing information to drive better decisions.”

Strategic role of cloud computing in manufacturing
Stepping back to examine the long term, Parker says cloud computing technology will be the most important platform for continuing the IT productivity improvements of the past decade, which he said were enabled by such technology trends as virtualization, ERP consolidation, software outsourcing to India and exponential growth in network bandwidth. “IT costs actually went up almost 50% in that time period,” he said. “It’s just that revenue doubled. [IT] was able to do more with less.”
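Parker’s “more with less” point is worth spelling out: if IT costs rose about 50% over the decade while revenue doubled, IT got cheaper relative to the business it supports. A two-line check using only the figures quoted above:

    # If costs grow 1.5x while revenue grows 2x, IT cost per revenue dollar
    # falls to 0.75x, i.e. roughly 25% cheaper relative to the business.
    it_cost_growth, revenue_growth = 1.5, 2.0
    print(f"IT cost per revenue dollar: {it_cost_growth / revenue_growth:.2f}x")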

Cloud computing will be the key to continuing these increases in IT utilization rates while still pushing down equipment costs, according to Parker. In the coming decade, however, “manufacturing is being asked to improve [IT] productivity, but in half the time,” he said.

The report concludes with these recommendations:

  • Publish a cloud computing position paper that includes the company’s near- and long-term plans.
  • Set specific goals for the amount of resources and applications to run in the cloud.
  • Perform an honest assessment to ensure current staff has the relationship and “service-level audit” skills to shift to a cloud-centric strategy.
  • Align key business objectives with cloud computing investments.





Article source: http://www.pheedcontent.com/click.phdo?i=f673283dccc336df8690d1fdefc68db8

The Dodd-Frank Act could mean a data management disaster for some

Tuesday, May 3rd, 2011

Financial institutions had better get ready for new regulatory compliance mandates stemming from the Dodd-Frank Wall Street Reform and Consumer Protection Act, or they may soon find themselves dealing with costly data management problems, according to experts.

The new compliance rules, which are being created to help government regulators identify “systemic risk” and avoid another economic meltdown, have yet to be clearly defined. But there are some key steps that banks and financial services firms can take to start preparing now. They include embracing standards, setting up a comprehensive data governance or master data management (MDM) program, and conducting a thorough audit and remediation of information stores.

For more on the Dodd-Frank Act and other regulatory compliance laws

Learn about the data security implications of financial services regulatory reform

Find out why IT is concerned about increased regulations

Learn about the benefits of voluntary compliance with regulations

“This law is happening,” said Fred Cohen, group vice president and global head of capital markets and investment banking at Patni, a Cambridge, Mass.-based IT services and infrastructure management firm. “It’s not well defined but there is movement under way, and the financial institutions need to figure out what they’re going to do to respond.”

Patni, which runs a data management round table with members from 15 financial institutions, recently published a white paper with several recommendations as to what banks and financial services firms can do to prepare for the ramifications of the Dodd-Frank Act.

“This is just a huge undertaking,” Cohen said. “Y2K is going to be a walk in the park compared to this.”

The Dodd-Frank Act will demand data transparency

The Dodd-Frank Act, which was signed into law by President Barack Obama last July, aims to give the government greater regulatory control over the financial services industry and fix the systemic problems that led to the economic crisis of 2007-10. The act touches on just about every aspect of the financial services industry and has been frequently criticized by Republicans as going too far. In March, Republicans introduced five new bills designed to amend and repeal parts of the financial overhaul law.

The act creates new regulatory reporting agencies, such as the Office of Financial Research and the Financial Stability Oversight Council, which have been working with the financial services industry to come up with mutually acceptable data reporting and analysis standards. Many of the mandates related to data management are expected to be finalized this summer, according to Michael Atkin, managing director of the Enterprise Data Management Council, a nonprofit trade association that addresses data management issues faced by the financial services industry.

What’s clear today is that the Dodd-Frank Act is focused on ensuring the transparency of information in a handful of key areas, Atkin said. That includes instrument reference data, or contractual information related to financial “instruments” like equities, bonds and derivatives; entity reference data, which is used for counterparty risk assessment and defines who is doing business with whom; pricing data; and information about portfolio holdings.

“Those four areas make up the core of the financial transaction process database, and we all have to get those things standardized and comparable,” Atkin said. “All of this really boils down to the relationships between the instruments we create and trade, the companies we do business with and the obligations that we hold as a result of our role in the transaction process.”
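Atkin’s four data areas map naturally onto a small reference-data model: instruments, entities, prices and holdings, linked so that counterparty exposure can be rolled up. The sketch below is a minimal illustration; the field names are assumptions, not anything prescribed by the act.

    # Minimal reference-data sketch of the four areas; field names are assumed.
    from dataclasses import dataclass

    @dataclass
    class Instrument:     # contractual terms of an equity, bond or derivative
        instrument_id: str
        asset_class: str

    @dataclass
    class Entity:         # counterparty reference data: who we do business with
        entity_id: str
        legal_name: str

    @dataclass
    class Holding:        # a position ties an entity to an instrument at a price
        entity_id: str
        instrument_id: str
        quantity: float
        price: float      # pricing data, per unit

    def exposure(holdings: list[Holding], entity_id: str) -> float:
        """Total market value held against one counterparty."""
        return sum(h.quantity * h.price for h in holdings if h.entity_id == entity_id)

    book = [Holding("cpty-1", "bond-x", 1000, 99.5),
            Holding("cpty-1", "eq-y", 500, 31.2)]
    print(exposure(book, "cpty-1"))   # -> 115100.0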

Getting ready for the Dodd-Frank Act

The first thing an organization can do to get ready for the upcoming mandates is conduct a thorough audit and remediation of information stores, Atkin said.

“They’ve got to clean up the mess,” he explained. “Then you’d better start embracing standards – standards for identification of things, standards for the descriptions of things. In our industry we have to identify instruments and entities and we’ve got to classify the transactions so that we can do some analysis.”

Experts say the next step is to establish, or reinstate, a strong commitment to data governance and data quality. For some organizations, that could mean finally getting serious about implementing an MDM program.

“All of this [regulatory] activity is managed by governance,” Atkin said. “[Organizations need to have] their internal global governance and coordination processes in place to get alignment and manage all of these things.”

Richard Ordowich, a senior partner with STS Associates Inc., a Princeton, N.J.-based data quality and data governance consulting firm, agreed that auditing data is a good first step on the road to compliance.

“My suggestion is that organizations start to study their data, such as customer data. Do an inventory, reverse engineering the semantics of the data and the rules governing its usage,” Ordowich said in an email interview. “Knowing about the data is the first step in looking for solutions to regulatory compliance and determining what solutions are possible.”
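An inventory of the kind Ordowich suggests can start very simply: profile each column of a customer extract for distinct values and gaps before investing in compliance tooling. A minimal sketch; the CSV file and its layout are hypothetical.

    # Profile each column of a (hypothetical) customer extract: distinct values,
    # missing entries and the most common value.
    import csv
    from collections import Counter

    def profile(path: str) -> None:
        with open(path, newline="") as f:
            rows = list(csv.DictReader(f))
        for column in rows[0].keys():
            values = [row[column] for row in rows]
            filled = [v for v in values if v.strip()]
            print(f"{column}: {len(set(filled))} distinct, "
                  f"{len(values) - len(filled)} missing, "
                  f"top: {Counter(filled).most_common(1)}")

    profile("customers.csv")   # hypothetical customer master extract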





Article source: http://www.pheedcontent.com/click.phdo?i=66238ad862b987cb3606aa1c12793172

Cisco opens new green data centre in Texas

Tuesday, April 19th, 2011

CBR Staff Writer
Published 19 Apr 2011

Features unified cloud computing technology; has a PUE rating of 1.35

Cisco Systems has opened a new data centre in Allen, Texas, which the company says is equipped with unified cloud computing technology and several energy efficiency features.

The new data centre is near another Cisco facility in Richardson, Texas. The two data centres are meant to serve as “active-active” mirrors of one another, providing instant backup and synchronised updates of changes at each.

The company said the new data centre features a comprehensive data centre fabric including Nexus 7000 and 5000 Series switches, Nexus 1000V Virtual Switches, MDS storage networking switches, Data Center Network Manager and NX-OS, the data centre operating system that spans the Cisco data centre portfolio.

Cisco said the unified fabric saved it over $1m on cabling. The facility uses additional hardware and software from partners EMC, NetApp and VMware.

The new data centre has a PUE rating of 1.35, which means that for every unit of energy consumed by the IT equipment, a further 0.35 units are spent on power distribution, cooling and other infrastructure. The facility draws 5MW but can scale up to 10MW. The office spaces are powered by solar panels.
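PUE (power usage effectiveness) is simply total facility energy divided by the energy consumed by the IT equipment, so a rating of 1.35 means 0.35 units of overhead per unit of IT load. A worked example with an illustrative load split:

    # PUE = total facility energy / IT equipment energy.
    def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
        return total_facility_kw / it_equipment_kw

    it_load = 1000.0    # kW drawn by servers, storage and network (illustrative)
    overhead = 350.0    # kW for cooling, power distribution, lighting
    print(pue(it_load + overhead, it_load))   # -> 1.35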

Among other features of the new facility are: the ability to withstand hurricane winds of up to 175 mph; the use of rotary flywheels in the UPS room; and air-side economiser cooling.

The facility also uses a solid floor and overhead cooling, instead of a raised-floor design.

Cisco Data Center, Switching, and Services Group senior vice-president John McCool said the new data centre showcases Cisco’s innovation leadership and its data centre architectural flexibility to deliver any application, in any location, at any scale, in a secure and open manner.

Cisco Server Access and Virtualization Group vice-president Soni Jiandani said, “As critical business assets, data centres today are undergoing rapid technology and architectural changes to accommodate and respond more rapidly to evolving business goals.”

“Innovative Cisco technologies like the Cisco Unified Computing System and Nexus product families are helping data centres transform into an agile and efficient networked environment that helps deliver information from any device to any content, anywhere, at any time.”

Cisco has applied for Gold certification under Leadership in Energy and Environmental Design (LEED) for the new facility.

Article source: http://servers.cbronline.com/news/cisco-opens-new-green-data-center-in-texas-190411


Oracle drops Itanium development

Wednesday, March 23rd, 2011

Oracle will stop development for Intel’s Itanium chip.

More on Itanium

Itanium processor delays put HP’s plans in question

New Intel Itanium processor halts dead chip talk

Microsoft Windows drops Itanium

In what may be the software giant’s least controversial move lately, Oracle said today that it is cutting bait on Itanium software development. With that move, it follows Red Hat and Microsoft, which last year said they would phase out Itanium support for their server operating systems.

For example, Microsoft last April said Windows Server 2008 R2 will be its last server OS to support Itanium.

Industry observers were not surprised at Oracle’s move. “It’s rather expected news. Itanium has
always been a low-volume processor.  It doesn’t surprise me at all that Oracle would rather
invest its research and development monies elsewhere,” said independent analyst Dan Kusnetzky of
Kusnetzky Group LLC.

HP offers several Itanium servers, which represent the majority of the Itanium-based segment, and some observers cast this latest news in the context of continuing friction between HP and Oracle.

“This will probably piss off HP, but no one else has any skin left in this game as far as I
know,” said the CTO of a large Midwestern financial institution.

But Matt Eastwood, group vice president of enterprise servers at IDC, said Oracle’s decision
represents a double standard for Oracle.

Oracle CEO Larry Ellison “made a big deal saying that HP and Oracle need to stand by their
mutual customers, and now Oracle seems to be forcing those customers to make a choice,” Eastwood
said.

That decision may end up helping IBM more than Oracle, he added.  “IBM has the strongest
non-x86-based roadmap, in my opinion,” he noted.

HP probably makes 80% of the Itanium servers on the market, with the rest coming from Fujitsu
and Hitachi, Eastwood said.

The Itanium architecture, once also known as IA-64, was initially developed by HP and then became a joint development project by HP and Intel. But in late 2004, HP relinquished its part in developing and manufacturing the chip, ceding that work, and its Itanium engineers, to Intel.

Intel announced its latest Itanium chip, the Tukwila, last year, but popular support for the chip has flagged in the face of the Intel x86 behemoth. HP still offers several Integrity and Superdome servers based on Itanium.

Early Itanium promise faded

In the early part of the last decade, Itanium showed great promise. “People forget that Itanium
was once a very big deal — it was the go-to chip on all the HP roadmaps after HP bought Compaq [in
2001],” said one long-time integrator executive in the Boston area.

“Itanium was ahead of its time and one of the big claims was it could run Windows, Linux and
Unix natively on the same hardware,” this integrator said. 

But repeated delays to Itanium put server vendors, especially HP, in a tough spot.

Given Itanium’s minor market share, Oracle’s decision to forego further development is not
surprising, he and others said.

Let us know what you think about the story; email Barbara Darrow, Senior News Director at bdarrow@techtarget.com, or follow us on twitter.