Posts Tagged ‘Standard’

US HOT STOCKS: General Motors, OCZ Technology, Research In Motion

Friday, November 23rd, 2012

U.S. stocks closed solidly higher in Friday’s shortened session, as the Dow Jones Industrial Average gained 1.35% to 13010, the Standard & Poor’s 500-stock index rose 1.3% to 1409, and the Nasdaq Composite jumped 1.38% to 2967.

CNH Global NV (CNH, $48.86, +$1.36, +2.86%) investors took heart after Fiat Industrial SpA (FNDSF) confirmed Thursday that CNH’s board of directors reacted favourably to a sweetened offer to buy out the company’s minority …

Article source: http://online.wsj.com/article/BT-CO-20121123-708641.html

US HOT STOCKS: OCZ Technology Active in Late Trading

Wednesday, November 21st, 2012

U.S. stocks closed slightly higher Wednesday, as the Dow Jones Industrial Average rose 48 points to 12837, the Standard & Poor’s 500-stock index gained 3.2 points to 1391, and the Nasdaq Composite advanced 9.9 points to 2927. Among the companies with shares actively traded after hours is OCZ Technology Group Inc. (OCZ).

Data-storage provider OCZ Technology disclosed it was under review by the Securities and Exchange Commission, after the company recently reported that it is reviewing its financial statements. Shares fell 6.7% to $1.11 after hours.

Fitch Ratings and Standard & Poor’s Ratings Services downgraded their credit ratings on …

Article source: http://online.wsj.com/article/BT-CO-20121121-712840.html

PUE data centre efficiency metric to be standardised ‘within months’

Wednesday, November 21st, 2012

The Power Usage Effectiveness (PUE) metric, which measures how well a data centre uses its power, is set to become an ISO standard within the next year, according to The Green Grid.

Speaking at The Green Grid EMEA Forum in Brussels, André Rouyer, Industry and Government Alliances at Schneider Electric and EMEA Liaison Work Group Chair for The Green Grid, said that standardisation is pivotal to the future success of the metric.

“It takes time to do this, there is a process to follow. But PUE will be standardised in a few months,” he said.

PUE compares the total energy consumed by a data centre to the amount of energy that actually reaches the IT equipment, showing how much is lost to other apparatus such as cooling systems.
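For illustration only (the figures below are invented, not taken from the article), the metric itself is a simple ratio of total facility energy to IT energy:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy / IT energy.

    A PUE of 1.0 would mean every watt reaches the IT equipment;
    higher values indicate overhead (cooling, power distribution, lighting).
    """
    if it_equipment_kwh <= 0:
        raise ValueError("IT energy must be positive")
    return total_facility_kwh / it_equipment_kwh

# Hypothetical facility: 1,500 MWh total, 1,000 MWh reaching the IT load.
print(pue(1_500_000, 1_000_000))  # 1.5 -> a third of the energy is overhead
```

Without a standard method, the dispute described below is over what goes into each of those two numbers and where they are measured, not over the arithmetic.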

While PUE has won support within the IT industry, there has not been a standard way to measure it, meaning that operators can fiddle the figures in their favour if they choose to do so.

This makes it difficult to compare the energy efficiency of one data centre to another, and also means that the metric cannot be used as a point of reference in any official or legal situation.

Efforts to define a common way of measuring and publishing values for PUE, and to get it recognised as a formal standard, have been carried out by a taskforce of global leaders from government, industry and the non-profit sector since 2009.

The taskforce includes The Green Grid, the US Department of Energy, the Environmental Protection Agency’s Energy Star Program, the Uptime Institute and the US Building Council, among others.

The most recent version of the taskforce’s recommendations was published in May 2011 and is available here, and the first full meeting with the ISO (International Standards Organisation) took place last November.

Now it seems that the efforts of the task force have finally paid off. Once PUE becomes a standard, data centre operators will have to follow a very strict set of rules on how energy consumption is measured. This should, in theory, make data centre PUE ratings stand up to scrutiny.

Rouyer warned that standardisation is not the same as certification. Once a standard is in place, it will be up to an external body like KPMG to set up a certification scheme, whereby data centres are audited and PUE ratings verified. However, standardisation is an important first step.

“In future, rather than referring to the metrics, we can measure PUE according to the ISO standard,” said Paolo Bertoldi of the European Commission’s Directorate-General Joint Research Centre (JRC).

The news could have significant implications in the UK, where the government’s CRC Energy Efficiency Scheme forces large organisations to monitor their emissions and purchase allowances for every tonne of CO2 produced by the energy they use.

Intellect, the UK tech industry’s trade association, has been campaigning on behalf of the data centre industry for the government to grant the sector an exemption from the CRC, due to the contribution that it makes to reducing energy consumption in other industries.

The data centre industry hopes to secure a Climate Change Agreement (CCA) that acts as a carrot rather than a stick, offering a £22 rebate for each tonne of CO2 saved. However, in order to secure a CCA, the industry needs to provide evidence of the energy intensity of data centre operations.

According to Andrew Donoghue, analyst at 451 Research, the standardisation of PUE could help.

“There isn’t an overall way of assessing the energy efficiency of data centres, so if the data centre industry is given a special exemption under the CCA, how would they prove that they were being efficient if there aren’t any recognised standards for doing that?” he said.

“At the moment everybody uses PUE all over the shop, so you couldn’t form any kind of regulatory mechanism around that. Having a standard would make it more structured.”

Donoghue added that the CRC is on its last legs anyway, and is likely to be amended or thrown out entirely before long, but said that a PUE standard could help the industry defend itself against any future carbon taxation that comes down the pipeline.

“This is all about the industry trying to come up with its own metrics and ways of measuring itself, fundamentally to avoid a top-down approach,” said Donoghue. “The last thing they want is a bunch of civil servants to draw up a standard independently and impose it on them.”

David Snelling, partner division manager of Fujitsu’s Information Systems Division and vice chair of The Green Grid’s EMEA Technical Work Group, added that PUE will probably be the first in a series of metrics to be standardised as a way of dealing with the question of energy and data centre usage.

“I think the industry has done very well with PUE, but it doesn’t address the complete carbon use or the total energy use issue,” said Snelling.

“PUE is the right metric for the right time, but there is also a wider sustainability issue that we’re looking at, and the metrics about carbon use (CUE) and water use (WUE) start to capture that forward-looking attitude.”

Last week, The Green Grid announced three new energy efficiency metrics designed to help data centre owners and operators improve the performance of their facilities.

The Green Energy Coefficient quantifies the portion of a facility’s energy that comes from green sources, the Energy Reuse Factor identifies the portion of energy that is exported for reuse outside of the data centre, and the Carbon Usage Effectiveness metric enables an assessment of the total greenhouse gas emissions of a data centre relative to its IT energy consumption.
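As a sketch of how these three metrics are calculated (the formulas follow The Green Grid’s published definitions as described in the paragraph above; all of the figures are invented for illustration):

```python
def green_energy_coefficient(green_kwh: float, total_kwh: float) -> float:
    """GEC: portion of facility energy sourced from green energy (0..1)."""
    return green_kwh / total_kwh

def energy_reuse_factor(reused_kwh: float, total_kwh: float) -> float:
    """ERF: portion of facility energy exported for reuse outside the data centre."""
    return reused_kwh / total_kwh

def carbon_usage_effectiveness(total_co2_kg: float, it_kwh: float) -> float:
    """CUE: total greenhouse gas emissions (kg CO2eq) per kWh of IT energy."""
    return total_co2_kg / it_kwh

# Hypothetical facility figures:
print(green_energy_coefficient(300_000, 1_200_000))   # 0.25 -> a quarter green
print(energy_reuse_factor(60_000, 1_200_000))         # 0.05 -> 5% exported for reuse
print(carbon_usage_effectiveness(600_000, 800_000))   # 0.75 kg CO2 per IT kWh
```

Like PUE, each is a simple ratio; the value of standardisation lies in pinning down what counts in the numerator and denominator.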

The Green Grid said the new metrics move the industry one step closer to a universally adopted set of metrics, indices, and measurement protocols.

Article source: http://www.pcadvisor.co.uk/news/green-computing/3412444/pue-data-centre-efficiency-metric-be-standardised-within-months/

Is ASHRAE TC9.9 a ‘Standard’?

Sunday, November 18th, 2012

The simple answer is ‘no’: it is a technical guideline on Thermal Conditions for ICT hardware, issued by a trade association – in this case the American Society of Heating, Refrigeration and Air-conditioning Engineers.  But it is at the very heart of the drive for energy-efficient data-centres – nothing else comes close.  It also has standing way above its American roots in that nothing equivalent (or even similar) exists in Europe or Asia-Pac.  That in itself is a problem for ‘real’ Standards outside of the USA: for instance in BSEN50600, the new Data Centre Infrastructure standard for Europe, we have been required to ensure that energy efficiency enablement is a key feature but we are not ‘allowed’ to reference ASHRAE – despite the fact that this is the only source of information for what is, and what is not, acceptable for temperature and humidity conditions of servers and the like.  Only by pushing that range (basically hotter with less humidity control) can we achieve large energy savings on the M&E side of a facility.
Of course one of its strengths lies in the fact that it is NOT a standard.  It can be, and has been, frequently updated – on a far more frequent basis than any national or international standard could ever dream of.  However the main strength of TC9.9 lies in the make-up of its contributors.  This is no remote, independent research body telling the industry what to do.  TC9.9 ‘is’ the industry, and its contributors, the ICT hardware OEMs, bring all of their research to the table – although not always to be shared and, on one subject in particular, the effect of high humidity, in wanting volume.
Recently I was astounded to read one USA industry thought leader on a LinkedIn forum saying that ASHRAE had ‘let the industry down, hanging onto tight environmental controls to no benefit and not backed by any engineering research’.  In fact it was that statement, which I can’t really agree with, that prompted me to write this blog.  OK, I do agree that too much emphasis has been laid on the ‘recommended’ conditions and the ‘allowable’ ranges have not been well enough described – and in an industry driven by paranoia (fear of the f-word!) rather than engineering that only panders to the ‘safe-side’ and ‘jobsworth’ brigade.  However, after probing around TC9.9 byways and backwaters, I believe that we still have some way to go before we really understand the effects of high humidity together with higher temperatures and poor air-quality.  The data doesn’t exist.  The chemistry is simple, especially if outside air is drawn from an area with a large body of standing water nearby, and contamination and corrosion will, no doubt, increase if hardware is not refreshed in 3 years (you can pick your own number, I am just tying my number back to Moore’s Law and capability gains) but the proof doesn’t exist.
Overlay that onto the fact that the OEMs sell through the channel and have no wish to create any increase, no matter how marginal, in server failure rates, and you can see why hanging onto the ‘recommended’ is preferred by the majority of TC9.9 members.
But just look at how far we have come – and led by TC9.9:
• We have substituted the 21-22°C CRAC return air-temperature (that was produced by under-floor air at 14-15°C) and enabled free-cooling to become a reality rather than a novelty, with outside air at 6°C during 5-10% of the year.  This redefinition of the temperature, and where it was to be measured, is the biggest ‘win’.
• We have gotten nearly everybody to agree that 26°C inlet is not a risk and pushed free-cooling in large parts of the world to over 90%.  In fact with adiabatic spray the UK doesn’t need air-conditioning at all, even in backup, if the ‘allowable’ limits are used.
So, no one can say that they have not enabled huge steps in energy efficiency.  We could argue about how fast they moved away from the punch-card and read-write tape head temp/hum specs of the 1960s mainframes but they are there now.
I also suspect that ‘they’, the ICT OEMs, don’t always agree and that the last published version of the Thermal Guidelines is quite a compromise – I would like to be a fly-on-the-wall.
Mind you, I am still confused by the A1-A4 classes of equipment, although the intent is clear.  The way I see it, A1 Class is the commodity boxes that represent the bulk of the ICT (non-telecom) market and A4 is the rather more robust and expensive telecom stuff – NEBS spec in ‘merican-speak.  The A2 and A3 classes seem to describe equipment between the two extremes that does not exist yet?  If buyer balance on cost comes into play I note that energy-efficient servers haven’t taken the market by storm due to the (relatively low) extra cost, and that blade sales are not replacing pizza-box servers for the same reasons of economics – maybe the same will apply to A2 & A3?
I also hear that some protagonists within the EU CoC want to import A4 into the best-practice guide but I don’t see that happening simply on commercial grounds.  A4 is basically no controls at all for all of Europe – 5-45°C and 5-95%RH (or something close, concentrating on dew-point avoidance).
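For context on what ‘dew-point avoidance’ means at those wide ranges, here is an illustrative calculation using the Magnus approximation – a generic meteorological formula, not anything specified by ASHRAE, and the coefficients below are one common parameterisation:

```python
import math

def dew_point_c(temp_c: float, rel_humidity_pct: float) -> float:
    """Approximate dew point (degrees C) via the Magnus formula.

    Reasonable roughly for 0-60 C and RH above a few percent;
    an illustration only, not an ASHRAE method.
    """
    a, b = 17.27, 237.7
    gamma = (a * temp_c / (b + temp_c)) + math.log(rel_humidity_pct / 100.0)
    return (b * gamma) / (a - gamma)

# At the hot, humid corner of an A4-style envelope (45 C, 95% RH) the dew
# point sits close to the dry-bulb temperature -- condensation territory.
print(round(dew_point_c(45, 95), 1))   # roughly 44 C
print(round(dew_point_c(26, 50), 1))   # a 26 C inlet at 50% RH: roughly 14.8 C
```

The point of the sketch: the same RH figure implies wildly different condensation risk at different temperatures, which is why the allowable envelopes concentrate on dew point rather than RH alone.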
What the future holds is interesting.  I suspect that TC9.9 won’t let loose the Allowable humidity limits without a lot more work and supporting data, and I also suspect that the data will show a 1-2% increase in failure rate per year, every year.  But should we worry?  With adiabatic cooling we can achieve the same energy efficiency with or without using fresh-air in the room, and so take the question of the humidity supplied to the load completely out of the equation for those locations where lakes, sea or poor air-quality are nearby.  The Google and Facebook ‘open the window’ model clearly works for them with their very short refresh-rates but may not be so easy for the smaller facilities with 5-year+ tech refreshes?
 

Article source: http://www.datacenterdynamics.com/blogs/ian-bitterlin/ashrae-tc99-%E2%80%98standard%E2%80%99&u=4892

Continuous monitoring: A piece of the IT security puzzle

Friday, November 16th, 2012

Continuous monitoring is replacing periodic certification of government information systems as the federal standard for IT security, but it is a means to an end rather than an end in itself, say government security pros.

“Continuous monitoring is a tactic in a larger strategy,” said Ron Ross, senior computer scientist at the National Institute of Standards and Technology.

The larger strategy is a comprehensive approach to the growing number of vulnerabilities, threats and attacks targeting government systems, which have put information security on the Government Accountability Office’s list of high-risk activities since 1997.

To be effective, the systems being monitored must be fundamentally sound, Ross said. Frequently checking a broken lock does not make it any more effective. “We can’t get ourselves out of this problem by counting things faster.”

Agencies must simplify their IT environments through the use of tools such as cloud computing and enterprise architectures to make them more manageable, and then invest in the necessary security to make them resilient.

Ross, who heads NIST’s Federal Information Security Management Act compliance program, made his comments at Symantec’s Government Security Symposium in Washington Nov. 7.

FISMA calls for agencies to monitor the security status of IT systems, but the details of how to do it have been left to the Office of Management and Budget. OMB initially established a requirement for periodic security authorization, with certification and accreditation of IT systems done every 3 years. With the pace of change in IT and in the cyber threat landscape it has become apparent that this is inadequate, however, and in the past 3 years the focus has moved toward continuous monitoring of systems as a replacement for triennial reauthorization.

Article source: http://gcn.com/articles/2012/11/16/continuous-monitoring-it-security-puzzle.aspx

MasterCard shows credit card with LCD screen and PIN

Sunday, November 11th, 2012



MasterCard’s unconventional Display Card has been launched by Standard Chartered Bank in Singapore as the global payments network attempts to head off the threat posed to plastic money by rising technologies such as smartphones using near-field communication (NFC).

Featuring an embedded on/off button, LCD screen and a 12-digit PIN keypad, the Display Card is a big step up in security by the obsolete standards of today’s credit cards.

Users insert the card into a sales terminal as normal, entering their chip-and-PIN number. Turning on the Display Card makes it possible for the system to generate a one-time password (OTP) that has to be entered immediately on the embedded keypad.
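The article does not say which algorithm the Display Card uses; a common scheme for hardware-token one-time passwords is HOTP (RFC 4226), sketched here purely for illustration:

```python
import hashlib
import hmac
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """Event-based one-time password per RFC 4226 (HOTP).

    HMAC-SHA1 over a big-endian 8-byte counter, dynamically truncated
    to a 31-bit integer, reduced to the requested number of digits.
    """
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

secret = b"12345678901234567890"  # the RFC 4226 test-vector secret
print(hotp(secret, 0))  # "755224", matching the RFC's first test vector
```

In a card like this, the shared secret would live in the card’s secure element and the counter (or a clock, for time-based variants) would advance with each use, so a stolen card number alone is useless without the card.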

In effect this combines the token-reader technology used by some banks today with the physical card itself, doing away with a layer of inconvenience. It also makes it impossible for criminals using stolen credit card information to siphon money or goods without having the card itself.

Because the system is built in at the level of the global network, there is also potential for it to be used by multiple banks and issuers through a single card.

The downside is that the user has to enter two sets of numbers when buying items, which could cause queues to slow down. Banks and issuers will see that as worth it as long as the cards themselves don’t add to consumer costs.

In the future, consumers would also be able to use the embedded LCD to get information on their bank balances, loyalty points and lists of recent transactions, MasterCard said.

“MasterCard continues to be at the forefront of payment technology. From launching the first ‘paper’ card in the 1950s, to introducing magnetic stripes and EMV chips for secure, digitised payments, we are pleased to have been able to support the launch of Singapore’s first Display Card by Standard Chartered,” said MasterCard’s Matthew Driver.

The technology for the system was provided by NagraID Security in partnership with Gemalto.

The Display Card has been around for a couple of years, albeit launched quietly in out-of-the-way corners of the banking world, such as by BNP Paribas’ Turkish subsidiary TEB, which started offering the system in November 2011.

Having used chip-and-PIN technology since 2004, Europeans will experience the technology as a two-factor extension of a familiar principle; US users, still stuck in the insecure signature age, might find it more of a shock to the system. What is less clear is how quickly the Display Card will reach mainstream users in these regions.

That could give rival mobile banking vendors plenty of support in their attempts to displace plastic-based payments.

Article source: http://pcworld.co.nz/pcworld/pcw.nsf/news/mastercard-shows-credit-card-with-lcd-screen-and-pin

Amazon EC2 price cuts signal IaaS land grab

Saturday, November 3rd, 2012

Amazon’s new generation of Standard Amazon Machine Instances and reduced on-demand Linux prices will kick off a new pricing war for IaaS providers, experts say.

The new Standard Amazon Machine Instances (AMIs), introduced this week for Amazon’s Elastic Compute Cloud (EC2), follow Moore’s Law in adding processing heft. Each has 50% more processing power than the first generation and is optimized for applications such as media encoding, batch processing, caching, and Web serving, Amazon said.


The company also made these cuts to the prices of first-generation Linux on-demand EC2 instances:

  • Small instances (up to 1.7 GB of memory) cut from 8 to 6.5 cents per hour
  • Medium instances (1.7 GB to 3.75 GB) cut from 16 to 13 cents per hour
  • Large instances (3.75 GB to 7.5 GB) cut from 32 to 26 cents per hour
  • Extra large instances (7.5 GB to 15 GB) cut from 64 to 52 cents per hour
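A quick sanity check of the headline discount, using the prices from the list above:

```python
# First-generation Linux on-demand EC2 prices, cents/hour (old, new).
cuts = {
    "small": (8.0, 6.5),
    "medium": (16.0, 13.0),
    "large": (32.0, 26.0),
    "extra large": (64.0, 52.0),
}

for name, (old, new) in cuts.items():
    pct = (old - new) / old * 100
    print(f"{name}: {old} -> {new} cents/hr ({pct:.2f}% cut)")
# Every tier works out to the same 18.75% reduction, consistent with the
# "18% to 20%" figure the analysts cite.
```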

At 18% to 20%, it’s a substantial decrease, but it’s also the 21st time Amazon has lowered its prices, said David Linthicum, chief technology officer and founder of Blue Mountain Labs. The difference this time is that Amazon and its established competitors, such as Rackspace, are facing new threats from emerging IaaS companies, some of which may come from abroad and could offer as much as 50% off Amazon’s prices.

“It’s going to be a race to the bottom,” Linthicum said. “Amazon as well as Rackspace and other leaders in this space … [are] creating loss leaders to gather as much market share as they can over the next couple of years.”

The Amazon EC2 price cuts seem odd, given that the company lost more than $200 million last quarter, said Larry Carvalho, owner of consulting firm Robust Cloud LLC.

“So it is obviously a land grab in terms of getting more people to use [the service],” he said.

The Amazon price cuts are also a move to get existing customers to leave more of their static infrastructure on Amazon Web Services, rather than moving it to a colocation data center or some other alternative, Carvalho said.

The new second-generation standard AMIs will be available in extra large (15 GB) and double extra large (30 GB) sizes, which will go for 50 cents an hour and $1.16 an hour, respectively.

Beth Pariseau is a senior news writer for SearchCloudComputing.com and SearchServerVirtualization.com. Write to her at bpariseau@techtarget.com or follow @PariseauTT on Twitter.




Article source: http://www.pheedcontent.com/click.phdo?i=a0bf46111a1bb8ceb9217329fea07f93

BYU-Idaho Professor Hacks Rexburg’s Network … On Purpose

Friday, October 12th, 2012

Some eastern Idaho lawmakers were left slack-jawed when a local professor demonstrated how easy it was to hack into the City of Rexburg’s security systems.

The Rexburg Standard Journal reports that Brigham Young University-Idaho professor Steven Rigby took no more than 8 hours to hack his way into the network, simply utilizing security access. Rigby told lawmakers that he could delete or edit files, change passwords and download viruses onto the system.

The Journal reports that Rexburg has approximately 200 individual computer stations, supported by 19 separate servers, but security on the system has weakened over time.

Article source: http://www.boiseweekly.com/CityDesk/archives/2012/10/12/byu-idaho-professor-hacks-rexburgs-network-on-purpose