Sunday, August 31, 2008
Tuesday, August 19, 2008
Posted by: Anurag Bansal at 8:20 PM
In one of my previous posts, I mentioned a couple of options for sharing files with family and friends. Out of that list, some of the options let you upload any kind of file, and your files remain on the host's servers indefinitely.
I am listing some of the options which I tried and would definitely recommend anyone interested. For complete review and restrictions, read my previous post on the same topic.
Thank you to Anurag Bansal for permission to post his research and for the valuable content on his blog!
Mystical lessons within the clouds.
by Rabbi Boruch Leff
When was the last time you asked a question with childlike wonder, like "Why is the sky blue?" If it's been a while, let's try one now.
Why did God create a world with clouds?
Scientifically, we could answer that clouds consist of water that has evaporated into vapor and then condensed into tiny droplets. When these droplets combine and become heavy enough, they fall as rain. As meteorologist Jeff Pardo describes it, clouds "help regulate the earth's energy balance by reflecting and scattering solar radiation or absorbing the earth's radiated infrared energy." Clouds also maintain the earth's atmospheric stability, because clouds form when air rises and cools: when a parcel of air rises into an area of lower pressure, it cools; when it reaches its dew point temperature, it becomes saturated, water begins to condense, and rain follows.
This is why the Talmud states that God never withholds clouds from the world -- they are required constantly for the world's existence. (Taanit 3b)
But there are deeper mystical lessons contained within the clouds. The verse states, "God covers the heavens with clouds and prepares rain for the earth" (Psalms 147:8).
The onset of heavy, dark clouds may appear somewhat menacing, but it is really nothing of the sort. God brings the clouds and fills them with rainwater, bringing tremendous blessing to the world. The lesson is clear: God often sends us worries and troubles, but in the end we come to understand that the purpose of the ordeals was to carry us to great achievements.
In a deeper vein, Kabbalists suggest that there are times when God brings clouds to block the sunlight for ecological purposes. Similarly, there are times when God presents blockades to spiritual success for certain purposes. If a person stares directly into the sun, his eyesight is temporarily impaired, and prolonged exposure would lead to vision damage. This is why we cannot tolerate staring into the sun.
We experience this in the spiritual realm as well. There are times when jumping to great spiritual heights too quickly is damaging to our growth. One who takes on too much too soon can easily burn out and, in the end, regress.
Stable growth needs to happen gradually. Taking on too much, too fast, most often doesn't last. This is one of the reasons God redeemed the Jewish people from Egypt in stages; releasing oneself from an idolatrous Egyptian culture cannot be done overnight.
Spiritual growth requires patience and consistency. Remember, it doesn't matter how high up you are on the spiritual ladder, as long as you are moving up.
Perhaps this is why God blocks the sun with clouds. Ultimately the clouds' purpose is to produce the blessing of rain; in spiritual terms, the clouds remind us to aim for permanent, steady growth, one step at a time.
Clouds act as a type of barrier, letting us know that there are levels that are presently beyond us and we shouldn't leap to heights we're not yet ready for. But when we do climb the ladder and finally reach the clouds, we see that they have no strong substance to them. You can fly right through them! Clouds are a mirage, they are not real obstacles.
The message is clear. Once we start to grow spiritually and embark on the path toward heaven, we should not be intimidated by the obstacles, the clouds that lie before us. They are only an illusion. Just keep soaring and you'll pass right through them.
Site contents copyright © 1995 - 2008 Aish HaTorah
Saturday, August 30, 2008
Sharing some of the ingredients & spells from my Grimoire …
Recently, we have been seeing lots of energy-efficient products entering our lives, the change has been triggered by factors like rising oil prices, increase in environmental pollution and soaring electricity costs. There has been a significant shift towards energy efficient computers, especially in the data centers.
Centaur has come up with a very interesting way of evaluating the energy efficiency of microprocessor products, called the TreeMark™ tree rating. Centaur defines it as “The number of trees that need to be planted to counter the amount of carbon dioxide created as a by-product of the electricity generated to power the processor over its operational lifetime”. The good thing is, Centaur's processors rate very well on this benchmark when compared with their Intel or AMD counterparts.
I still keep wondering what the TreeMark rating would be for the components other than the processor in a computer. If a processor's TreeMark rating is 7 trees, the rest of the electronics in the computer could raise the bar by 14-20 more trees, not to mention the cooling required in the datacenters.
The situation demands more involvement on the part of the industry leaders in the process of creating energy-efficient compute appliances. I would like to mention the good work done by IBM on the POWER6 processor.
The figure below shows how the TreeMark Tree rating can be calculated for an electrical load.
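In lieu of that figure, the calculation Centaur describes can be sketched in a few lines of Python. All constants below (device lifetime, grid emissions per kWh, CO2 absorbed per planted tree) are illustrative assumptions, not Centaur's published figures:

```python
import math

def treemark_rating(avg_power_watts,
                    lifetime_years=3.0,      # assumed operational lifetime
                    kg_co2_per_kwh=0.5,      # assumed grid-average emissions factor
                    kg_co2_per_tree=100.0):  # assumed lifetime CO2 uptake of one tree
    """Trees needed to offset the CO2 emitted in generating the
    electricity a component draws over its operational lifetime."""
    hours = lifetime_years * 365 * 24
    kwh = avg_power_watts * hours / 1000.0
    kg_co2 = kwh * kg_co2_per_kwh
    return math.ceil(kg_co2 / kg_co2_per_tree)
```

Under these assumptions a 65 W processor would rate about 9 trees, which also makes it easy to see how the rest of a computer's electronics, and the datacenter cooling on top of that, would multiply the total.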
Copyright 2007. Electronics & IT Grimoire. All rights reserved.
Published on WSO2 Oxygen Tank (http://wso2.org)
by Ayanthi Anandagoda
Table of Contents
- Cloud Computing
- Software as a Service (SaaS)
- Hardware as a Service (HaaS)
- Platform as a Service (PaaS)
- At Cross Roads..
Cloud computing essentially encapsulates three concepts: pay-as-you-go, on-demand, and on the Net. It is the computing model of the day, in which the use of IT is billed like a utility, hence the term utility computing. Cloud computing is part of the on-demand model for computing that allows companies to focus on creating true business value rather than dwell on setting up and maintaining IT infrastructure to get going.
How, then, do we relate cloud computing to SaaS, PaaS and HaaS? As you dig deeper it becomes apparent that SaaS, PaaS and HaaS are different categories of cloud computing: SaaS refers to applications in the cloud, PaaS to platforms in the cloud, and HaaS to infrastructure in the cloud.
2. Software as a Service..
The concept of 'Software as a Service', or SaaS, is an application delivery model in which vendors host Web-based applications on the Internet and consumers use them online. Technologies such as Web services and REST play an integral role in the development of SaaS applications.
The origins of the popular CamelCase term go back to a white paper published in 2001 by the Software & Information Industry Association's eBusiness Division, titled "Strategic Backgrounder: Software as a Service", which discussed “the delivery, management and payment of software by Application Service Providers (ASPs) as a service rather than a product..”, with users 'subscribing' to the use of software rather than purchasing it upfront.
Despite the hype back in the day, the ASP model failed to take off as its originators anticipated, largely due to limited network availability and the Web being untested ground for many. Today the tables have clearly turned, with Internet access at lightning speeds, a far larger population living on the Web, and increased adoption of open standards.
Web 2.0 largely addressed the Web-based socializing and consumer-oriented software domain, but not SaaS. The list of vertical industries qualifying for the SaaS application model includes Customer Relationship Management (CRM) systems, Supply Chain Management (SCM) systems, Human Resources Management (HRM) systems, video conferencing, accounting and various others.
SaaS presents a number of licensing and pricing models for vendors to choose from, including pay-as-you-go, subscription-based, revenue-based, usage/transaction-based and others. Some even go as far as offering complete services free of charge, preferring to monetize with ads alone.
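To make the difference between these pricing models concrete, here is a toy billing function; the prices and the included-usage allowance are made-up numbers, not any vendor's actual rates:

```python
def monthly_bill(usage_units, model="pay_as_you_go",
                 unit_price=0.10,        # hypothetical price per metered unit
                 flat_fee=50.0,          # hypothetical subscription fee
                 included_units=1000):   # allowance bundled with the subscription
    """Compare two common SaaS pricing models: pure metering vs. a
    flat subscription that meters only usage beyond its allowance."""
    if model == "pay_as_you_go":
        return usage_units * unit_price
    if model == "subscription":
        overage = max(0, usage_units - included_units)
        return flat_fee + overage * unit_price
    raise ValueError("unknown pricing model: " + model)
```

A light user (400 units) pays less under pure metering, while a heavy user (1,500 units) pays less under the subscription, which is exactly the trade-off vendors weigh when choosing a model.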
3. Hardware as a Service..
The concept of 'Hardware as a Service', or HaaS, refers to the virtualization of the data center. HaaS provides the real estate while SaaS provides the application functionality in the journey toward cloud computing supremacy.
With striking similarities to hardware leasing, HaaS is a model in which the vendor, rather than the customer, manages the lease, which helps keep service calls to an absolute minimum.
Earlier this year, Apple announced its slightly different flavor of HaaS, which promises to improve old hardware with software upgrades, an approach already adopted by Microsoft on Zune devices, Nintendo on the Wii, and Sony with Blu-ray playback on the PlayStation.
4. Platform as a Service..
The next logical evolution of computing in the cloud comes as an integrated platform to build, test, and deploy custom applications, which we call Platform as a Service (PaaS).
The concept of 'Platform as a Service' is a form of cloud computing that delivers development environments as a service rather than full-blown applications. Pioneered by Amazon, the space already includes Google's App Engine, where you can sign up for a free account and get 500MB of persistent storage, plus enough CPU and bandwidth for about 5 million page views a month. Sun is on its way with its own Platform as a Service offering, dubbed Project Caroline.
- Utility Computing refers to making computing resources available as a metered service.
- Grid Computing refers to an infrastructure in which networked computers are able to access and utilize the power of one another
- Elasticity refers to the ability to dynamically acquire or release computing resources on-demand
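Of the three terms above, elasticity is the easiest to illustrate in code. The sketch below is a deliberately naive autoscaling rule with arbitrary thresholds, not any provider's real algorithm:

```python
def elastic_capacity(current_servers, utilization_pct,
                     scale_up_at=80,    # assumed high-water mark (% utilization)
                     scale_down_at=30,  # assumed low-water mark
                     min_servers=1):
    """Elasticity in miniature: acquire a server when average utilization
    runs hot, release one when it runs cold, otherwise hold steady."""
    if utilization_pct > scale_up_at:
        return current_servers + 1   # acquire capacity on demand
    if utilization_pct < scale_down_at and current_servers > min_servers:
        return current_servers - 1   # release capacity you no longer pay for
    return current_servers
```

The point of the utility model is that a released server stops costing you money immediately, which a machine in your own rack never does.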
SaaS is predominantly looked at as an application delivery model, as opposed to the concept of SOA (Service Oriented Architecture), an architectural strategy that weaves together services to create business processes. So what have they got in common?
On one side, the flexible and scalable pedigree of SOA brings value to SaaS: loosely coupled, contracted services empower SaaS providers to compete more efficiently in the marketplace against packaged, on-premise software vendors in terms of price, flexibility and other service quality offerings. As demand for scalability and flexibility mounts, SaaS can only serve the short term without SOA to optimize the construction and operation of SaaS services for the long run.
On the other side of the equation, many enterprises increasingly expect SaaS to be made available for their SOA implementations without getting bogged down in development. Although the initial impression of the cloud computing model was all about delivering software, the transformation is far more fundamental and deep-rooted in SOA.
The intersection has been inevitable. The two forms have converged and have already begun to open up great possibilities for the enterprise.
Are SaaS, cloud computing and PaaS serious enough for the enterprise to build and deploy business applications?
- virtualization of computing power
- on demand service coupled with a pay-per-use business model yielding to economies of scale
- increased scalability
- ability to leverage power of SOA
- reduced startup times, especially for the Small Office/Home Office (SOHO) customer, who no longer has to bear all the costs of infrastructure and maintenance.
- single point of accountability
- rich application functionality at dramatically low costs
- removes the need to over-buy in terms of "safety net" capacities to handle periodic traffic spikes
Cloud computing seems convincing enough. But how about security and privacy?
The concern inevitably raises the question: how much do we need to know about the services we acquire or consume from other sources? To this point I'd say that the companies offering cloud computing services will live and die by their reputations. As cloud computing leaves users feeling that they lose a degree of control over their often-sensitive information, it is up to the cloud operators to convince them otherwise.
There are additional concerns, including how well the popular pay-as-you-go payment model is defined, as service consumption variables become complex with tiers of service constraints being added.
- Amazon Web Services: “..capitalizes on Amazon's combination of computational skills and operational savvy. It piggybacks on a multi billion-dollar IT infrastructure. And it pulls in a whole new category of customers looking for rock-solid scalable computing on demand — blue-chip startups like Zillow and PowerSet, kids in garages building the next Google, even adventurous corporate IT jocks looking to offload some of the drudgery.” 
- IBM's recently announced cloud computing offering, dubbed 'Blue Cloud', is expected to offer enterprise data center facilities to the financial services industry. To have Oracle endorsing such a concept as well is a landmark in itself. Cloud computing offers the economic advantage of leveraging the provider's shared infrastructure without having to bear the major cost of directly supporting a widely distributed user base; the case for doing it yourself is progressively getting weaker.
- The Sun Grid Compute Utility: Based on open-source technologies such as Solaris and various Java technologies, Sun Grid is an on-demand grid computing service operated by Sun Microsystems.
- Salesforce.com: Founded in 1999 by Marc Benioff (an ex-Oracle employee), the company is headquartered in San Francisco, California and is 'The Leader in On-Demand Customer Relationship Management (CRM)'. Its AppExchange allows external developers to create extensions that link into the core Salesforce.com system. These extensions typically include varieties of sales and financial tools.
- Oracle aims to package a range of Oracle products into a coherent platform that ISVs can build their SaaS offerings on. Just a little over a month ago Oracle launched their latest on-demand CRM release - Oracle CRM On Demand.
- Microsoft's Live Mesh with design goals of unified device, data and application management.
- NetSuite's hosted online business software programs include accounting, customer relationship management (CRM), enterprise resource planning (ERP), e-commerce and Web site development.
The on-demand model is moving everything from software applications to processing power to storage and APIs from desktops and organizational data centers to the cloud. The obstacles lie more along the lines of security and privacy, and cloud operators are expected to prove themselves against rival hackers. Despite the concerns, however, the utility-style, pay-by-the-drink pricing trend in computing will no doubt change society as profoundly as cheap electricity did a century ago.
10. References & Resources:
- RightScale Blog - Define Cloud Computing
- Strategic Backgrounder: Software as a Service
- Web 2.0 - Beyond the Conference..
- Macworld Confirms Growing Trend of 'Hardware as a Service' - WIRED
- Project Caroline - Platform.. as a Service
- Service Oriented Architecture - an Overview
- Cloud Computing. Available at Amazon.com Today - By Spencer Reiss
- Google App Engine - Run your web applications on Google's infrastructure.
- The Big Switch - Nicholas Carr
© 2008 WSO2 Inc.
By Ina Fried
Staff writer, CNET News
August 29, 2008, 4:00 a.m. PDT
Editors' note: This is part of a series exploring computing in Latin America.
CAMPINAS, Brazil--A math teacher gives a class of eighth-graders their assignment and tells them to get to work.
The students grab their bags and fan out across the campus, enjoying the sunny autumn day. Sitting in groups of three and four, some at tables and some on the ground, the students work on the day's lesson. None of the students are using books or writing on paper. Instead, in each student's hands is a small blue-and-white computer that acts as both textbook and notebook.
Click here to read all of the blogs in The Borders of Computing series.
The computers are Intel's Classmate PC, and each of the students at the Bradesco Foundation school here has one to use each day. As the largest one-to-one computing project in Latin America, it's being closely watched. School officials say there is more at stake than the reputation of the Classmate PC, however.
"We have to tread carefully," said Vice Principal Tania Maria Gebin de Carvalhao. "You can't have a recall of students and say 'wait, we did it wrong, come back.'"
The stakes are also high for the technology companies involved. Intel and Microsoft hope not only to show the power of giving laptops to students, but also to show the world that they too have a product in this area, with so many headlines in the U.S. focused on Nicholas Negroponte's One Laptop Per Child project. (Microsoft, more recently, has started working with OLPC as well.)
One of the keys is knowing when to use computers and when not to use them. In chemistry, for example, it's important that students have the hands-on experience they get by mixing chemicals in test tubes.
Eighth-grade math students work at Intel Classmate PCs in an outdoor classroom at the Bradesco Foundation school in Campinas, Brazil.
"The lab is still good," Gebin de Carvalhao said, but the computers have also come in handy, such as if a teacher wants to demonstrate an explosive reaction.
"Sometimes for safety reasons, it's better not to do it in the lab."
Although it is traditional paintings and not PowerPoint illustrations that hang on her wall, art teacher Elaine Barreiros has also found the computers to be a valuable addition to her classroom.
On this day, she has about two-thirds of the class researching the dress of different ethnic groups while a third of the kids have set their laptops aside and are carving wax sculptures based on their research.
Elaine shrugs off the notion that computers might get in the way. Pointing to the current project, she notes that many of the students will never have the opportunity to travel even to all of Brazil. "This is the best resource we have," she said, pointing to a laptop. "They can travel the world."
In teaching geometry class, Paulo Cesar Mucci uses an electronic whiteboard to show how a protractor works, noting that the technology makes it possible to see every degree, something that wouldn't be the case if he had to hold up a protractor or draw one by hand.
Unlike in some other one-to-one programs, the Bradesco students don't get to take the laptops home each night.
There are two main reasons for this. First of all, when this group of students heads home in the afternoon, the laptops' day is just beginning. Students in Brazil go to school for only four hours a day, meaning the school is able to offer three shifts of classes: morning, afternoon, and night. As a result, the laptops can do triple duty, even with each student having his or her own laptop throughout the day.
Even if they had more laptops, they still wouldn't send them home, though. Administrators would be worried about the laptops making it back to school. It's not that they think the students would mistreat or misplace the laptops.
"They might get mugged," said the school's principal. Because of where the students live, "it's still not safe."
Copyright ©2008 CNET Networks, Inc., a CBS Company. All rights reserved.
Friday, August 29, 2008
SAN JOSE, California (AFP) — Lifelike graphics are breaking free of elite computer games and spreading throughout society in what industry insiders proclaim is the dawning of a "visual computing era."
Astronauts, film makers and celebrities joined software savants, engineers and gamers in the heart of Silicon Valley this week for a first-ever NVision conference devoted to computer imagery advances changing the way people and machines interact.
"Visual computing is transforming the videogame industry; transforming the film industry, and has all kinds of potential for how we view real-time television," NVIDIA co-founder Jen-Hsun Huang told those gathered at the event.
"We solve some of the most challenging problems for more and more companies around the world. Let the era of visual computing begin."
Gamers dueled for three days in a cavernous room in the San Jose Convention Center while entrepreneurs showed how graphics breakthroughs are shining in other fields.
Car makers are exploring letting potential buyers not only customize automobiles with graphics software but go on virtual test drives.
Graphics processing underpins financial modeling and weather forecasting.
Israel-based Optitex demonstrated software that replicates fabrics so realistically that clothing designers can see what fashions will look and act like on people before garments are made.
Optitex's animation software is being eyed by Hollywood film makers.
Dassault Systemes puts 3D computer-assisted design to work virtually constructing passenger jets, buildings and more.
"Three-D should be a new way for us to dream and design the future of our world," the French company's chief executive Bernard Charles said at NVision.
"It will impact everything we do: education, science, talking to each other ... of course games."
He predicts that lifelike graphics combined with feedback from online communities will let people influence how products are designed, sold and even how "green" they are.
Charles maintains computer simulations will be so realistic that virtual activities will mirror physical experiences.
Simulators already play an important part in training for space shuttle missions, according to former US astronaut Eileen Collins, the first woman shuttle commander.
"When you fly the actual mission you feel like you are in a simulator," Collins said. "We really can't do our job without the good visual graphics that we get."
The world of visual computing is "inescapable," said Chris Malachowsky, a co-founder of NVIDIA, a California firm renowned for high-end graphics processing cards for computers.
"We are being presented with displays everywhere," Malachowsky told AFP. "It used to be about the computing part, but the emphasis is shifting. It is not so much about the computation but how it is presented and seen by people."
The rising tide of digital videos, photos, films and television shows on the Internet is lifting the status of graphics chips, cards, and software and strengthening a trend to "unflatten" displays with 3D imagery.
Malachowsky spoke of using visual computing power to develop new medicines or provide doctors with real-time 3D images of patients' organs.
"They will be able to recreate scan data so fast you could see your own heart beating," Malachowsky said.
"This is being subsidized by all these kids out there playing games."
Perceptive Pixel founder Jeff Han, referred to by some as "the father of touch screen" computing, maintains that graphics open up user interface control possibilities that could render the "mouse" obsolete.
Han demonstrated touch-screen technology that lets several people simultaneously manipulate applications and files on a single large monitor.
"It's not personal computing anymore," Han said. "It's visual computing."
Battlestar Galactica bombshell Tricia Helfer praised computer animation innovations that enable the science fiction television series to rivet viewers.
Helfer plays a part-machine, part-organic Cylon character called "Number Six" that has turned on its creators.
"It's a bit threatening," Helfer said of technology promising to one day make animated characters indistinguishable from real actors.
"But the advantages and uses of it are amazing, but it is something we are going to have to get used to."
The key advantage of cloud computing isn't performance or scalability – it is portability
Saying that your business should never, never, ever use cloud-based applications instead of desktop or network/server based ones is about as ridiculous as saying that cloud-based applications will eventually replace IT completely. Mostly cloud computing is a way to provide an application at low startup costs in exchange for revenue over time.
With an article that begins, "Cloud computing apps are for suckers. If there is an alternative that runs locally on your own machine, it will always be better," John C. Dvorak seems to be going from "baiting Mac users" to "baiting Google users."
But let's just take the argument at face value. Some of the points he makes are good ones – specifically, those concerning performance.
I don't care if you have 30-megabit-per-second service, you'll get flaky performance from most online apps, especially if they're popular. Always remember that your online speed is only as good as the speed at which data is coming at you: The application server may be swamped, and the various nodes along the route could become clogged, too. Nothing is ever as fast as the machine sitting on top of (or beneath) your own desk.
Your desktop is faster than the cloud – that's true – but is your car? Information stored in the cloud can be accessed from any place with a Net connection. Information stored locally can only be accessed locally – well, unless you connect through a VPN or set up a VNC server. But even for those of us who know how to do it, a VNC server is a hassle, and a security risk unless you do it exactly right. Ninety minutes is horrendous downtime for an enterprise application, and Dvorak is right insofar as any application for which 90 minutes of downtime is unacceptable shouldn't be put on the cloud.
But there are plenty of applications – and for small-to-medium companies, e-mail is one of them – where the losses incurred from 90 minutes of downtime are less than the cost of having a dedicated in-house application installed and maintained on the network. (If the opposite is true, don't use cloud computing; use the in-house application, and keep an eye on how it performs.)
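That comparison fits in a back-of-the-envelope formula. All the numbers below are hypothetical; a real estimate would also count admin time, hardware refresh, and the in-house system's own downtime:

```python
def prefer_cloud(downtime_hours_per_year, loss_per_downtime_hour,
                 inhouse_annual_cost, cloud_annual_cost):
    """Choose the cloud service when its subscription cost plus expected
    downtime losses undercut running the application in-house."""
    cloud_total = cloud_annual_cost + downtime_hours_per_year * loss_per_downtime_hour
    return cloud_total < inhouse_annual_cost

# e.g. a small shop: 1.5 h/yr of outages costing $200/h, hosted e-mail at
# $600/yr vs. $5,000/yr to run and maintain a mail server in-house
```

For those made-up numbers the cloud option wins easily; push the downtime up to a few days a year and the in-house server starts to pay for itself.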
Dvorak also points out that your data is at the mercy of the service provider and that if the service is cut off, for whatever reason, so is your data. That's true, but if you don't back up your data, it can just as easily be lost to a hard drive crash. Both are about as likely to happen, in my experience.
To Dvorak, "People tend to forget that software is NOT a service; the whole cloud scheme is a scam to lock users into a single product and somehow extract more money from them." There is some aspect of vendor lock-in, but mostly cloud computing is a way to provide an application at low startup costs in exchange for revenue over time – whether through advertising, in the case of Google's apps, or through a subscription model. Yes, it is very much "renting" rather than "owning," but that can very well make financial sense in many cases.
After that, the arguments get a bit silly.
What happens if the net is attacked and your entire cloud world is gone for days and days? It just happened in the Republic of Georgia, and it can probably happen anywhere.
If the Russians start bombing us, John, I'm sure that the boss will give us a few days off.
Ask yourself why the heck will we need six-core, high-performance chips if the cloud takes over everything?
Why do we need six-core, high-performance chips now? In a virtualized server, certainly we'll need power to spare, but unless you're doing video editing or animation rendering, a six-core chip is probably overkill. And if we stop putting the big iron in the datacenters of big companies (very unlikely), they'll pop up in the data centers of the SaaS providers.
When it comes to performance and scalability, absolutely, standard client-server IT applications and local programs are going to have SaaS beat. Final Cut Pro is not going to the cloud. Photoshop isn't going to the cloud (though Photoshop Elements is...). But the key advantage of cloud computing isn't performance or scalability – it is portability. This is why people will pay twice as much for a laptop with the same specs as a desktop computer. Mobility is important.
Copyright © 2008 SYS-CON Media. All Rights Reserved.
About Brandon Wybenga
Brandon Wybenga is the System Administrator for DataPros for Healthcare, a data cleansing and consulting company based out of Tampa, Florida. He is also attending ITT Technical Institute, Tampa, earning his Bachelor of Science in Information Security Systems.
Posted by Roger Smith, Aug 28, 2008 05:28 PM
Many people don't like the concept of "private clouds," including my colleague John Foley and Sam Johnston ("The case against 'private clouds' "), since by definition cloud computing involves letting people plug into shared IT services in data centers that aren't their own. As oxymorons go, though, private cloud computing doesn't strike me as particularly egregious: I would probably rank it halfway between 'green data center' and 'business intelligence' on my own (admittedly moronic) oxymoron scale.
I discussed the cloud computing ecosystem earlier this month with Sam Charrington, VP of product marketing and management for Appistry, a maker of middleware that helps applications run smoothly in a cloud environment, after his LinuxWorld Expo Cloud Computing session. Charrington's view of the future of cloud computing includes Google (NSDQ: GOOG)-like public clouds as platforms for applications; virtual private clouds, which are third-party clouds, or segments of the public cloud with additional features for security, compliance, etc. (for HIPAA medical record compliance or SOX accounting standards, for example); as well as private or internal clouds, which are an extension of virtualization and are used primarily for their capital or operational efficiencies. His formulation makes sense to me, since it also dovetails with my view of cloud computing as a natural evolution of the grid/utility computing model, which is the delivery of storage, computation and other computing resources as metered services, similar to the way traditional public utilities deliver electricity.
A private cloud, by analogy, is computing capacity produced "off-the-grid" similar to the ways some homeowners produce electrical power with renewable energy sources such as solar arrays on their roof or windmills, and therefore have the option of using it themselves; selling it back to a centralized grid; or allocating it to anyone they choose. Ultimately, anyone with a data center will conceivably be able to provide cloud services, as long as those services conform to a set of cloud infrastructure standards, most of which have yet to be defined.
Sam Johnston ("Cloud standards: not so fast...") is one of many who say cloud standardization efforts are premature. Johnston points to the market-driven ecosystem that has sprung up overnight around Amazon (NSDQ: AMZN) Web Services as an example of what kind of cloud standards are needed, namely "simple, rugged, market tested interfaces defined by the innovators in each area (virtualization, storage, services, etc.)." I tend to agree that much of the cloud standardization effort at the moment seems to be putting the cart before the horse, although I'm intrigued by the possibility of leveraging some of the work that's been done on the new Open Virtualization Format (OVF), created by the Distributed Management Task Force (DMTF) standards organization. OVF is a platform-independent, efficient, extensible, open packaging and distribution format for virtual machines that allows virtualization packaging, distribution, installation, and management -- all within an archive or tar file such as "myApp.ova," which can include a digital signature for security. OVF can be compared roughly to the MP3 digital music format that is used to encapsulate music information. A vendor-neutral packaging format, it allows virtual machines, or sets of virtual machines, to be installed on any platform including public, virtual private, and private clouds. OVF is based on the DMTF's Common Information Model (CIM), which would make a good starting point for a cloud API. (If you're not familiar with CIM, it's an open standard that defines how managed elements in an IT environment are represented as a common set of objects and relationships between them. This is intended to allow consistent management of these managed elements, independent of their manufacturer or provider.) This seems like as good a blueprint as any for cloud standardization, although I would like some safeguards to assure the separation of public, virtual private, and private clouds.
Off-grid private clouds should be able to be autonomous in much the same way that off-grid homes can generate electrical power on-site with renewable energy sources such as solar or wind; with a generator and adequate fuel reserves; or simply done without, as in Amish communities.
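For a sense of what that packaging looks like in practice, here's a rough sketch of the kind of XML descriptor that sits at the heart of an OVF package. The element names follow the general shape of the OVF envelope (References, DiskSection, VirtualSystem), but this is a simplified illustration for the blog, not a complete or schema-valid OVF document:

```python
# Sketch of a simplified OVF-style descriptor, built with the standard
# library. Element names echo the OVF envelope structure but omit the
# namespaces and sections a real, schema-valid OVF 1.0 document requires.
import xml.etree.ElementTree as ET

def build_descriptor(vm_name, disk_file, capacity_bytes):
    envelope = ET.Element("Envelope")
    refs = ET.SubElement(envelope, "References")
    ET.SubElement(refs, "File", {"id": "disk1", "href": disk_file})
    disks = ET.SubElement(envelope, "DiskSection")
    ET.SubElement(disks, "Disk", {"fileRef": "disk1",
                                  "capacity": str(capacity_bytes)})
    system = ET.SubElement(envelope, "VirtualSystem", {"id": vm_name})
    ET.SubElement(system, "Name").text = vm_name
    return ET.tostring(envelope, encoding="unicode")

# The descriptor, disk image, manifest, and optional digital signature
# would then be bundled into a tar archive such as "myApp.ova".
descriptor = build_descriptor("myApp", "myApp-disk1.vmdk", 8 * 2**30)
parsed = ET.fromstring(descriptor)
print(parsed.find("VirtualSystem").get("id"))  # myApp
```

The point of the exercise is the vendor-neutrality: any platform that can read the descriptor can unpack and install the virtual machine it describes.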
Thursday, August 28, 2008
Posted in Uncategorized by storageoptimization on August 28, 2008
Image: PC Magazine
Good old billg has something to say in his “exit interview” about storage in the cloud in this week’s PC Magazine. In essence, his view is that computing and storage will move to the cloud at different rates, and that storage is the more logical thing to move first. Your local storage (presumably on your Windows PC in Mr. Gates’ worldview) will be a cache of a subset of the master data held in the cloud.
Moving data into the cloud makes a lot of sense: it makes that data available to computers everywhere, and it centralizes management of data for backups, geo-replication, and hardware refresh, achieving economies of scale that an average user or company could not manage or afford.
The other place is in the data center of the storage cloud provider. That's going to be a very competitive marketplace, and the cloud vendors that can charge you the least per gigabyte – to store, to transfer, to replicate – are going to have the competitive advantage. So the cloud vendors that do the best job of integrating storage optimization into the cloud in a transparent way will have the edge. And the cloud is a great place to get that edge.
Think of deduplication, for example. If you deduplicate songs just in your own house, well, you may only have one copy of each song. Why would you have ten copies of a Britney Spears song? (I might ask why you would have any at all … but we'll leave that for another time.) However, if 5 million people store their data at a cloud storage provider, how many copies of that hit song might end up there? Does the cloud provider need to store 5 million copies of the same thing? No. If they do, they are being very inefficient. A song is a simple example, but even with enterprise data, the more data you have, the more likely it is that you'll find patterns, correlations, duplicates, or data relationships that can be exploited for better compression. So the cloud offers opportunities for efficiency that don't exist in each little pool of local storage on your hard drive today.
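The deduplication idea can be sketched in a few lines: chunk the data, key each chunk by a cryptographic hash, and store each unique chunk only once. The chunk size, data, and user count below are made-up numbers for illustration; production systems use variable-size chunking, compression, and far more engineering:

```python
# Toy block-level deduplication: each unique chunk is stored exactly once,
# no matter how many users upload the same file.
import hashlib

CHUNK = 4096
store = {}              # digest -> chunk bytes; unique chunks kept once

def dedup_write(data):
    """Store `data` chunk-by-chunk; return the digest list needed to rebuild it."""
    recipe = []
    for i in range(0, len(data), CHUNK):
        chunk = data[i:i + CHUNK]
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)   # only never-seen chunks cost space
        recipe.append(digest)
    return recipe

def dedup_read(recipe):
    return b"".join(store[d] for d in recipe)

song = bytes(range(256)) * 1600                        # ~400 KB stand-in for one MP3
recipes = [dedup_write(song) for _ in range(1000)]     # 1,000 users, same song

logical = 1000 * len(song)                             # what naive storage would hold
physical = sum(len(c) for c in store.values())         # what the dedup store holds
print(f"logical {logical} bytes, physical {physical} bytes")
```

Reading any user's copy back via `dedup_read` reconstructs the identical file, while the provider pays for the chunks only once.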
To me, storage optimization and the move of storage to the cloud make a perfect match.
Storage Optimization aims to provide an objective look at the fast-changing world of storage. The blog was started by Carter George, co-founder of Ocarina Networks, and provides regular commentary, including guest posts from industry leaders, customers, and influencers, on how storage innovations are helping to shape the future of business.
by Cyrus Farivar
Morning Edition, August 27, 2008
STEVE INSKEEP, host: Poor people in developing countries have been the focus of ambitious projects, like this one aimed at bridging the world's digital divide. A non-profit called One Laptop Per Child made a promise three years ago to provide $100 computers to millions of children. The group has achieved only a fraction of that goal. But as Cyrus Farivar reports, One Laptop Per Child has still made its mark on the global computer industry.
CYRUS FARIVAR: There's only...
NPR (National Public Radio) is an internationally acclaimed producer and distributor of noncommercial news, talk, and entertainment programming.
Copyright 2008 NPR
Businesses must look at the whole picture when implementing green strategies
Written by David Tebbutt
Computing, 24 Jul 2008
Green computing is an admirable objective – but because the strategy focuses on computing itself, it often fails to consider the wider environmental problems that face the world.
Even labelling the green computing issue as “environmental” does not really do the trick. The better approach is to start with “sustainable development” and work back from there.
According to the 1987 Brundtland Report, also known as Our Common Future, sustainable development “meets the needs of the present without compromising the ability of future generations to meet their own needs”.
The report took an international view of sustainability and applied it to employment, food, energy, water and sanitation. We sometimes forget that if our computer equipment is made in China, its water and ground pollution largely stays there, while the gaseous emissions are shared with the whole world.
The only truly environmental way of accounting for our choices is to look at the lifecycle impact of what we buy and throw away. That includes everything from raw materials through construction, packaging, delivery, use and disposal.
Anything less than this is just playing at being green.
Of course, such an approach is also inconvenient. It is very difficult to discover the carbon footprint of what we buy – it is much easier to find out how much energy something uses for our base calculations.
As such, you rarely hear IT vendors talking about embedded carbon and other pollution in the products they try to convince IT managers to buy. And it is why governments are so keen to focus on pricing carbon and setting targets.
Some 5,000 British companies will be obliged to sign up for the Carbon Reduction Commitment in 2010, which will make businesses measure their carbon emissions to see how they compare year on year. League tables will be published and some firms will be rewarded with bonuses, while others will be punished with fines.
The commitment will focus minds. But companies will analyse their emissions – including those they inherit with their energy supplies – and may decide to offshore polluting elements of their work. The result will be a nice, clean UK operation and no reprimands for embedded carbon. As a result, the company in question might be obeying the letter of the law – but hardly its spirit.
So, what will drive companies to do the right thing? Money and regulation are top of the list. Brand value, corporate social responsibility and public relations are all closely intertwined.
Many companies need to be seen to be environmentally sensitive and if you work for such a firm, life will be relatively easy. These businesses concentrate on working with management and examine every corner of the business to discover opportunities to be greener.
However, if money and regulations drive the firm, every green decision needs to be costed. If regulations are involved, penalties are usually not far away.
As such, the equation still boils down to money – and you are wasting your time trying to appeal to the firm’s better nature. Sooner or later, though, the company will come under customer or shareholder pressure to act greener, especially if it is part of a supply chain to more committed businesses.
Centre attention on the areas that will make a difference but require little or no investment – switch off desktops at night, print fewer documents, turn off power chargers when they are not in use, turn off lights if no one is around.
Extend the period between machine replacements and find ways to reuse retired equipment. The Met Office, for example, sends its end-of-life computers to other measuring stations around the world.
Your next stage should be to look at virtualisation, which can help cut the amount of equipment you need or allow you to take on more work without buying new PCs. You might also be able to optimise your datacentre cooling without massive upheaval. And going even further, large companies can benefit from consolidating multiple datacentres, which can reduce electricity use and other associated costs.
But IT has a much bigger role to play in helping a company achieve good environmental performance.
Technology can help reduce energy use and other pollution in areas which are beyond IT’s remit.
Publishing reference materials online – catalogues, manuals, directories and so on – saves on print, transport and packaging. Videoconferencing can save time and money – and the stresses of work life can be diminished, too.
Home working can also help employees dodge the commute and, depending on how many people do it and for how many days, can also reduce office size and the associated costs.
In a recent survey, Freeform Dynamics discovered that only 28 per cent of IT departments actually know how much energy they use. And, no doubt, the figure would be even smaller if the question were asked with regards to the total IT estate.
If you belong to such a company, perhaps the most effective starting point for everything would be to obtain a copy of its bills. This approach might lead to more granular metering, so that you can see where to apply new measures for saving energy.
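To see why those bills matter, a back-of-envelope calculation along these lines is a reasonable first step. Every figure below (head-count, idle wattage, tariff) is an illustrative assumption, not data from the article:

```python
# Back-of-envelope sketch: what a fleet of desktops left on overnight
# wastes in a year. All inputs are assumed figures for illustration.
DESKTOPS = 500
IDLE_WATTS = 80            # assumed idle draw per machine
OFF_HOURS_PER_DAY = 14     # nights and early mornings
WORK_DAYS = 250
PRICE_PER_KWH = 0.12       # assumed tariff

wasted_kwh = DESKTOPS * IDLE_WATTS * OFF_HOURS_PER_DAY * WORK_DAYS / 1000
cost = wasted_kwh * PRICE_PER_KWH
print(f"{wasted_kwh:.0f} kWh/year wasted, roughly {cost:.0f} in energy costs")
```

Even with these rough inputs the total runs to six figures of kilowatt-hours, which is exactly the kind of number a copy of the bills makes visible.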
David Tebbutt is programme director at analyst Freeform Dynamics. Read the blog at: http://freeform.computing.co.uk
Computing provides insight for IT leaders.
Computing and Computing.co.uk are published in the UK by Incisive Media.
Posted By: Mariella Moon
Another PC that joins the ranks of the recently released Tangent Evergreen 17 and Shuttle X27 energy-efficient PCs is the new Akhter LoCO2 PC. The LoCO2 claims to be the world's first Energy Star 4.0-approved all-in-one PC, combining a 19-inch LCD panel, an Intel Core 2 Duo processor, and a hard disk drive in a single unit. The LoCO2 consumes 55 watts of energy when in use, and 3 watts when put in Sleep Mode.
In spite of the incorporated elements, the computer maintains a thin profile measuring merely 85mm in depth. Other optional features include a touch panel and 802.11 b/g WiFi connectivity. The Akhter LoCO2 all-in-one PC has a starting price of approximately $1,078 depending on the configuration.
Copyright © 1996-2008 Ziff Davis Publishing Holdings Inc. All Rights Reserved.
Wednesday, August 27, 2008
Cloud Computing, Grids, Everything-as-a-Service and more
Posted on August 19, 2008 at 01:33 AM
by Geva Perry
While the debate on the actual definition of cloud computing rages on, it seems that a whole new cloud computing vocabulary is rapidly emerging. I thought I'd list some of the new terms I'm seeing, with brief definitions, examples of usage, and references to discussions related to these terms. Hope this is useful.
Cloudburst: The term cloudburst is being used with two meanings, one negative and one positive:
- Cloudburst (negative): The failure of a cloud computing environment due to the inability to handle a spike in demand.
Reference: "The only way to do cloud computing efficiently is to share the cloud - to establish a broad, multitenant grid (or a number of them) that balances the loads of many different companies. Otherwise, it'll be one cloudburst after another, and a whole lot of underutilized capital assets." Source: Nicholas Carr: Intuit's cloudburst frustrates customers.
- Cloudburst (positive): The dynamic deployment of a software application that runs on internal organizational compute resources to a public cloud to address a spike in demand.
Reference: "ISV virtual appliances should underpin a new surge in cloud use followed by self-service mechanisms and enterprise connectors enabling organizations to 'cloudburst' to using cloud services." Source: The 451 Group: RightScale rolls its on-ramp toward other cloud systems (subscription required)
Related uses: Cloudbursting. Reference: "In addition to direct sales to enterprises, going forward it hopes that extending out from private clouds to public ones – what we like to call 'cloudbursting' – will become a prevailing IT weather pattern and provide it with additional opportunities." Source: The 451 Group: Q-Layer has the wisdom to enable private clouds (subscription required)
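The positive sense of the term reduces to a simple capacity check: serve demand from internal resources first, and overflow the excess to a public cloud. A toy sketch, with made-up capacities and an abstract "unit" of work:

```python
# Toy cloudbursting placement: route work to internal capacity first and
# overflow ("burst") the remainder to a public cloud. Capacities and the
# notion of a work "unit" are illustrative assumptions.
def place_load(demand, internal_capacity):
    """Split demand into (internal_units, burst_units)."""
    internal = min(demand, internal_capacity)
    return internal, demand - internal

print(place_load(80, 100))    # quiet day: everything stays internal -> (80, 0)
print(place_load(250, 100))   # spike: 150 units burst to the public cloud -> (100, 150)
```

Real cloudbursting adds the hard parts this sketch omits: provisioning lag, data locality, and moving work back in-house once the spike passes.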
Cloudstorming: The act of connecting multiple cloud computing environments.
Reference: "...Zimory will be covering off the key cloudy marketplaces and activities: public cloud, internal cloud, cloudbursting (grow-over from internal to public clouds) and cloudstorming (connecting multiple clouds)." Source: The 451 Group: A Cloud for All Seasons
Vertical Cloud: A cloud computing environment optimized for use in a particular vertical -- i.e., industry -- or application use case.
Reference: "The verticalization of the cloud would provide marketing benefits, as Friedman notes, while also providing a possible means of addressing issues of information security crucial to industries such as health care and financial services." Source: Nicholas Carr: The vertical cloud
Private Cloud: A cloud computing-like environment within the boundaries of an organization and typically for its exclusive usage.
Reference: "It is these companies that have dramatically leveraged their internal and originally Private Cloud Computing infrastructures to significant economic benefit. " Source: Kent Langley: Private Cloud Computing: A Few Thoughts
Internal Cloud: A cloud computing-like environment within the boundaries of an organization and typically available for exclusive use by said organization.
Reference: "With Cloud Computing becoming more and more popular, large corporations are likely to set up their own clouds and integrate them with external clouds, like Amazon EC2." Source: Markus Klems: Internal Cloud
Hybrid Cloud: A computing environment combining both private (internal) and public (external) cloud computing environments. May either be on a continuous basis or in the form of a 'cloudburst'.
Reference: "Microsoft would, no doubt, agree. Their "software plus services" approach similarly advocates a hybrid cloud/desktop environment." Source: Kendall Whitehouse: Kevin Lynch: Clearing the AIR
Cloudware: A general term referring to a variety of software, typically at the infrastructure level, that enables building, deploying, running or managing applications in a cloud computing environment.
Reference: "Go to Google Maps, Yahoo Mail, or MySpace — most of Web 2.0, in other words — and you're using cloudware." Source: Wired: Geekipedia - Cloudware
External Cloud: A cloud computing environment that is external to the boundaries of the organization. Although it often is, an external cloud is not necessarily a public cloud. Some external clouds make their cloud infrastructure available to specific other organizations and not to the public at-large.
Reference: "If an enterprise were to run an app in an external Cloud and wants to connect that to their systems of record in their own datacenters, they might want to consider the same platform in their data centers." Source: Bert Armijo: Pain in the aaSemantics
Public Cloud: A cloud computing environment that is open for use to the general public, whether individuals, corporations or other types of organizations. Amazon Web Services are an example of a public cloud.
Reference: Gerrit Huizenga: Um, Just who is managing your public cloud?
Cloud Provider: An organization that makes a cloud computing environment available to others, such as an external or public cloud.
Reference: "Some workloads, such as application testing and training, are prime candidates for early deployment to a cloud provider due to their transient nature and high Total Cost of Ownership (TCO)." Source: John Janakiraman: Deploying Your Existing Applications to the Cloud
Cloud Enabler: A general term that refers to organizations (typically vendors) who are not cloud providers per se, but make available technology, such as cloudware, that enables cloud computing.
Cloud-Oriented Architecture (COA): An architecture for IT infrastructure and software applications that is optimized for use in cloud computing environments. The term is not yet in wide use, and as is the case for the term "cloud computing" itself, there is no common or generally accepted definition or specific description of a cloud-oriented architecture.
Reference: James Urquhart: The Principles of Cloud Oriented Architecture
Cloud Service Architecture (CSA): A term coined by Jeff Barr, chief evangelist at Amazon Web Services. The term describes an architecture in which applications and application components act as services on the cloud, which serve other applications within the same cloud environment.
Reference: Jeff Barr: The Emerging Cloud Service Architecture
Virtual Private Cloud (VPC): A term coined by Reuven Cohen, CEO and founder of Enomaly. The term describes a concept that is similar to, and derived from, the familiar concept of a Virtual Private Network (VPN), but applied to cloud computing. It is the notion of turning a public cloud into a virtual private cloud, particularly in terms of security and the ability to create a VPC across components that are both within the cloud and external to it.
Reference: "A VPC is a method for partitioning a public computing utility such as EC2 into quarantined virtual infrastructure. A VPC may encapsulate multiple local and remote resources to appear as a single homogeneous computing environment bridging the ability to securely utilize remote resources as part of an seamless global compute infrastructure." Source: Reuven Cohen: Life in the Cloud: Virtual Private Cloud
Cloud Portability: The ability to move applications (and often their associated data) across cloud computing environments from different cloud providers, as well as across private or internal cloud and public or external clouds.
Cloudsourcing: As defined by Dion Hinchcliffe: "Leveraging services in the network cloud to provide external computing capabilities, often to replace more expensive local IT capabilities. Cloudsourcing can theoretically provide significant economic benefits along with some attendant trade-offs. These trade-offs can include security and performance. The term "cloud" represents a set of external services on a 3rd party network, usually the Internet. The services can represent raw computing, storage, messaging, or more structured capabilities such as vertical and horizontal business applications, even community. These services are delivered over the network, but generally behave as if they are local." Read an overview of cloudsourcing by Dion Hinchcliffe.
John Edwards, InfoWorld
Wednesday, August 27, 2008 1:00 PM PDT
The news that AT&T has joined the rapidly growing ranks of cloud computing providers reinforces the argument that the latest IT outsourcing model is well on its way to becoming a classic disruptive technology.
By enabling datacenter operators to "publish" computing resources -- such as servers, storage, and network connectivity -- cloud computing provides a pay-by-consumption scalable service that's usually free of long-term contracts and is typically application- and OS-independent. The approach also eliminates the need to install any on-site hardware or software.
Currently dominated by Amazon.com and several small startups, cloud computing is increasingly attracting the interest of industry giants, including Google, IBM, and now AT&T.
"Everyone and their dog will be in cloud computing next year," predicts Rebecca Wettemann, vice president of research at Nucleus Research, a technology research firm.
Yet James Staten, an infrastructure and operations analyst at Forrester Research, warns that prospective adopters need to tread carefully in a market that he describes as both immature and evolving. Staten notes that service offerings and service levels vary widely between cloud vendors.
"Shop around," he advises. "We're already seeing big differences in cloud offerings."
To help cut through the confusion, here's a rundown of some major cloud providers -- both current and planned -- all offering resources that go beyond basic services such as SaaS (software as a service) applications and Web hosting:
3Tera: Appliance-Driven Virtual Servers
3Tera's AppLogic is a grid engine that has evolved over time into a full-fledged cloud computing environment. The company says its offering is designed to enable datacenters to replace expensive and hard-to-integrate IT infrastructure -- such as firewalls, load balancers, servers, and SANs -- with virtual appliances. Each appliance runs in its own virtual environment.
AppLogic combines servers into a scalable grid that's managed as a single system via a browser or secure shell. According to 3Tera, datacenters can add or remove servers on the fly, monitor hardware, manage user credentials, reboot servers, install software, build virtual appliances, back up the system, repair damaged storage volumes, inspect logs, and perform every other management task from a single point of control, all while the system is running.
Amazon.com: As-You-Need-Them Basic IT Resources
Amazon was an early cloud computing proponent, and the company now has one of the market's longest menus of services. Amazon's core cloud offering, the Elastic Compute Cloud (EC2), provides a virtualized cloud infrastructure designed for scalable compute, storage, and communication facilities.
Amazon's cloud computing arsenal also includes the Simple Storage Service (S3), a persistent storage system, as well as the Simple Database (SimpleDB), which provides a remotely accessible database, and the Simple Queuing Service (SQS), a message queue service that's also an agent for tying together distributed applications created by the EC2, S3, and SimpleDB combo.
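That "agent for tying together distributed applications" role is the classic message-queue pattern: producer and consumer never call each other directly, only the shared queue. Here's a local stand-in using Python's standard library; the real SQS is accessed over the network, which isn't shown here, and the job names are made up:

```python
# Sketch of the queue-as-glue pattern a service like SQS provides: a
# producer and a consumer decoupled by a shared queue. A local stdlib
# queue stands in for the remote service purely to illustrate the shape.
import queue
import threading

work = queue.Queue()
results = []

def producer():
    for job in ["resize-img-1", "resize-img-2", "resize-img-3"]:
        work.put(job)
    work.put(None)                    # sentinel: no more jobs

def consumer():
    while True:
        job = work.get()
        if job is None:
            break
        results.append(f"done:{job}")  # stand-in for real processing

t1 = threading.Thread(target=producer)
t2 = threading.Thread(target=consumer)
t1.start(); t2.start()
t1.join(); t2.join()
print(results)
```

Because neither side holds a reference to the other, either can be scaled, restarted, or moved to a different machine without the other noticing -- which is the point of putting a queue between EC2 workers and the rest of an application.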
AT&T: Scalable Hosting in a Managed Network
AT&T Synaptic Hosting aims to give datacenters the ability to manage applications, compute resources on servers, and stored data elastically, so they can scale up or down as needed. The hosted platform provides dynamic security and storage capabilities as well as a Web portal to manage capacity, conduct maintenance, and monitor network service and performance.
AT&T has long offered hosting services, but not ones that could scale up or down as needed. AT&T's resources and services run within its own network, rather than across datacenters linked via the public Internet, which the company claims provides more certainty over service levels.
Google: Resources for Small Businesses and Developers
Google already offers several cloud-based services, such as e-mail and storage, for consumers, as well as the AppEngine development and provisioning platform for individual developers. The company's logical next step, given its vast infrastructure resources, would be a move into the enterprise cloud market.
"There's not that much difference between the enterprise cloud and the consumer cloud," Google CEO Eric Schmidt said last May during an appearance in Los Angeles with IBM chief Sam Palmisano, as the companies announced a joint cloud computing initiative.
Over the next year or so, Google and IBM plan to roll out a worldwide network of servers for a cloud computing infrastructure. The IBM-Google cloud runs on Linux-based machines using Xen virtualization and Apache Hadoop, an open source implementation of the Google File System. Provisioning is automatic, courtesy of IBM's Tivoli Provisioning Manager.
IBM: A Platform for Your "Internal" Cloud
Aside from its Google venture, IBM is focusing its cloud strategy on "Blue Cloud," a series of cloud computing offerings that will enable computing across a distributed, globally accessible fabric of resources, rather than on local machines or remote server farms. Built on IBM's massive-scale computing initiatives, Blue Cloud aims to give datacenters the ability to establish their own cloud computing architecture to handle the enormous data-processing power required for video, social networking, and other Web 2.0 technologies.
Initially, the Blue Cloud technology must be deployed internally at each organization, essentially as the foundation for an "internal" cloud. The Blue Cloud platform, running on IBM BladeCenters with Power and x86 processors and Tivoli service management software, dynamically provisions and allocates resources as workloads fluctuate for an application. Blue Cloud is being billed as a more distributed computing architecture than is typically found in most enterprise datacenters. It is based on Hadoop. Over time, IBM expects to offer Blue Cloud resources on demand, in the provisioned style of Amazon.com and AT&T.
IBM also provides hosting services for SaaS providers, including SAP and SuccessFactors.
Sun Microsystems: An On-Demand Grid, and Perhaps More
With its "the network is the computer" mantra, Sun provided much of the inspiration for the cloud computing movement. And its Sun Grid Engine was one of the first on-demand cloud offerings, providing access to compute and storage resources optimized for parallel-processing applications.
The company also has a research venture dubbed "Project Caroline" meant to provide a configurable pool of virtualized compute, storage, and networking resources to small and medium-size SaaS providers, so they don't need to develop their own infrastructure. There have been recent reports that Sun is planning to turn Project Caroline into a full-blown business, but there's been no official word from the company yet.
Terremark Worldwide: Resource Pool for On-Demand Servers
The Terremark Enterprise Cloud is designed to give datacenters an Internet-optimized computing infrastructure. Enterprise Cloud clients buy a dedicated resource pool of processing, memory, storage, and networking, from which they can deploy servers on demand. A Web portal allows servers to be dynamically provisioned from a pre-allocated pool of dedicated compute resources. Terremark promises that its cloud servers behave exactly like their physical counterparts, allowing applications to be run without modification.
XCalibre Communications: Self-Provisioned Virtual Servers
Described by some observers as Europe's answer to Amazon's EC2, Scotland-based XCalibre's FlexiScale provides self-provisioning of virtual dedicated servers via a control panel or API. Persistent storage is based on a fully virtualized high-end SAN/NAS back end.
Copyright 1998-2007, PC World Communications, Inc.
Internet access may replace the purchase of texts, Martin says
By Lisa Boone-Wood Journal Reporter
Published: August 27, 2008
Winston-Salem/Forsyth County school officials told the school board last night that buying more online subscriptions for social-studies textbooks might be a necessary next step.
While evaluating a tight budget earlier this year, school officials decided to buy classroom sets of social-studies books for sixth- and seventh-graders, instead of buying books for each student to take home.
After realizing that not every book in the classroom sets comes with access to an online version, school officials suggested buying an additional 82 online subscriptions, at a cost of about $5,000.
The additional subscriptions would be bought so more students can access the online textbooks outside of school.
Superintendent Don Martin said that every student has access to the books during the school day and can access online versions of the books and other learning tools if they have Internet access at home.
Students can also go to the more than 40 WinstonNet labs in local libraries and other locations to access the information, he said.
"It's kind of an experiment to see how that works," Martin said. "We will actually evaluate that at the end of the year. If it works well, we won't buy textbooks next year.
"I actually think the opportunity to not carry that big, old book back-and-forth and access the book online is interesting."
Martin told members of the school board last night that he has received several inquiries from parents about the online textbooks. "Everybody doesn't have a computer at home," he said, adding that another possibility in making the textbooks more readily available would be to use the old textbooks.
"We've had no one indicate that they want to go back to the old textbooks, but that is an option," he said.
■ Lisa Boone-Wood can be reached at 727-7232 or at email@example.com
©2008 Media General Communications Holdings, LLC. A Media General company.
About CherryPal for Everyone (CP4Every1 or CPFE)
Please note that all copyrights and links to original material are provided and respected. NO robots were used to post content.
Your comments are invited.