
Tuesday, 4 November 2008

Stormy Weather: 7 Gotchas in Cloud Computing

When the computer industry buys into a buzzword, it's like getting a pop song stuck in your head. It's all you hear. Worse, the same half-dozen questions about the hyped trend are incessantly paraded out, with responses that succeed mainly in revealing how poorly understood the buzzword actually is.

These days, the hottest buzzphrase is "cloud computing," and for John Willis, a systems management consultant and author of an IT management and cloud blog, the most annoying question is this: Will enterprises embrace this style of computing?

"It's not a binary question," he asserts. "There will be things for the enterprise that will completely make sense and things that won't."

The better question, he says, is whether you understand the various offerings and architectures that fit under that umbrella term, the scenarios where one or more of those offerings would work, and the benefits and downsides of using them.

Even cloud users and proponents don't always recognize the downsides and thus don't prepare for what could go wrong, says Dave Methvin, chief technology officer at PC Pitstop LLC, which uses Amazon.com Inc.'s S3 cloud-based storage system and Google Apps. "They're trusting in the cloud too much and don't realize what the implications are," he says.
With that as prologue, here are seven turbulent areas where current and potential users of cloud computing need to be particularly wary.

Costs, Part I: Cloud Infrastructure Providers
When Brad Jefferson first founded Animoto Productions, a Web service that enables people to turn images and music into high-production video, he chose a Web hosting provider for the company's processing needs. Looking out over the horizon, however, Jefferson could see that the provider wouldn't be able to meet anticipated peak processing requirements.

But rather than investing in in-house servers and staff, Jefferson turned to Amazon's Elastic Compute Cloud, a Web service known as EC2 that provides resizable computing capacity in the cloud, and RightScale Inc., which provides system management for users of Web-based services such as EC2. With EC2, companies pay only for the server capacity they use, and they obtain and configure capacity over the Web.

"This is a capital-intensive business," Jefferson said in a podcast interview with Willis. "We could either go the venture capital route and give away a lot of equity or go to Amazon and pay by the drink."

His decision was validated in April, when usage spiked from 50 EC2 servers to 5,000 in one week. Jefferson says he never could have anticipated such needs. Even if he had, it would have cost millions to build the type of infrastructure that could have handled that spike. And investing in that infrastructure would have been overkill, since that capacity isn't needed all the time, he says.

But paying by the drink might make less economic sense once an application is used at a consistent level, Willis says. In fact, Jefferson says he might consider a hybrid approach when he gets a better sense of Animoto's usage patterns. In-house servers could take care of Animoto's ongoing, persistent requirements, and anything over that could be handled by the cloud.


Costs, Part II: Cloud Storage Providers
Storage in the cloud is another hot topic, but it's important to closely evaluate the costs, says George Crump, founder of Storage Switzerland LLC, an analyst firm that focuses on the virtualization and storage marketplaces.

At about 25 cents per gigabyte per month, cloud-based storage systems look like a huge bargain, Crump says. But although Crump is a proponent of cloud storage, the current cost models don't reflect how storage really works, he says. That's because traditional internal storage systems are designed to reduce storage costs over the life of the data by moving older and less-accessed data to less-expensive media, such as slower disk, tape or optical systems. But today, cloud companies essentially charge the same amount "from Day One to Day 700," Crump says.

Amazon's formula for calculating monthly rates for its S3 cloud storage service is based on the amount of data being stored, the number of access requests made and the number of data transfers, according to Methvin. The more you do, the more you pay.
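
As a back-of-the-envelope illustration of that formula, here is a small Python sketch that totals a monthly bill from those three inputs. The rates are placeholders for illustration, not Amazon's published price list.

    # Monthly cost driven by storage volume, request count and data transfer,
    # as described above. All rates are illustrative placeholders.
    def monthly_storage_cost(gb_stored, requests, gb_transferred,
                             storage_rate=0.25,     # $/GB-month (illustrative)
                             request_rate=0.00001,  # $/request (illustrative)
                             transfer_rate=0.17):   # $/GB transferred (illustrative)
        return (gb_stored * storage_rate
                + requests * request_rate
                + gb_transferred * transfer_rate)

    # Example: 500GB stored, 2 million requests, 300GB transferred out.
    print(round(monthly_storage_cost(500, 2_000_000, 300), 2))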

Crump says that with the constant decline of storage media costs, it's not economical to store data in the cloud over a long period of time.

Cloud storage vendors need to create a different pricing model, he says. One idea is to move data that hasn't been accessed in, say, six months to a slower form of media and charge less for this storage. Users would also need to agree to lower service levels on the older data. "They might charge you $200 for 64G the first year; and the next year, instead of your having to buy more storage, they'd ask permission to archive 32G of the data and charge maybe 4 cents per gigabyte," Crump explains.
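
A rough sketch of the tiered model Crump proposes: data untouched for six months drops to a cheaper archive rate. The thresholds and rates below simply echo his example and are illustrative only.

    ACTIVE_RATE = 0.25    # $/GB-month for recently accessed data (illustrative)
    ARCHIVE_RATE = 0.04   # $/GB-month for archived data, per Crump's example

    def tiered_monthly_cost(datasets, archive_after_months=6):
        """datasets: list of (size_gb, months_since_last_access) tuples."""
        total = 0.0
        for size_gb, idle_months in datasets:
            rate = ARCHIVE_RATE if idle_months >= archive_after_months else ACTIVE_RATE
            total += size_gb * rate
        return total

    # 32GB still in active use, 32GB untouched for a year:
    print(tiered_monthly_cost([(32, 1), (32, 12)]))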

To further drive down their own costs and users' monthly fees, providers could store older data on systems that can power down or off when not in use, Crump says.

Sudden Code Changes

With cloud computing, companies have little to no control over when an application service provider decides to make a code change. This can wreak havoc when the code isn't thoroughly tested and doesn't work with all browsers.

That's what happened to users of Los Angeles-based SiteMeter Inc.'s Web traffic analysis system this summer. SiteMeter is a software-as-a-service-based (SaaS) operation that offers an application that works by injecting scripts into the HTML code of Web pages that users want tracked.

In July, the company released code that caused some problems. Any visitor using Internet Explorer to view Web pages with embedded SiteMeter code got an error message. When users began to complain, Web site owners weren't immediately sure where the problem was.

"If it were your own company pushing out live code and a problem occurred, you'd make the connection," Methvin explains. "But in this situation, the people using the cloud service started having users complaining, and it was a couple of hours later when they said, 'Maybe it's SiteMeter.' And sure enough, when they took the code out, it stopped happening."

The problem with the new code was greatly magnified because something had changed in the cloud without the users' knowledge. "There was no clear audit trail that the average user of SiteMeter could see and say, 'Ah, they updated the code,' " Methvin says.

Soon after, SiteMeter unexpectedly upgraded its system, quickly drawing the ire of users such as Michael van der Galien, editor of PoliGazette, a Web-based news and opinion site. The new version was "frustratingly slow and impractical," van der Galien says on his blog.

In addition, he says, current users had to provide a special code to reactivate their accounts, which caused additional frustration. Negative reaction was so immediate and intense that SiteMeter quickly retreated to its old system, much to the relief of van der Galien and hundreds of other users.

"Imagine Microsoft saying, 'As of this date, Word 2003 will cease to exist, and we'll be switching to 2007,' " Methvin says. "Users would all get confused and swamp the help desk, and that's kind of what happened."

Over time, he says, companies such as SiteMeter will learn to use beta programs, announce changes in advance, run systems in parallel and take other measures when making changes. Meanwhile, let the buyer beware.

Service Disruptions
Given the much-discussed outages of Amazon S3, Google's Gmail and Apple's MobileMe, it's clear that cloud users need to prepare for service disruptions. For starters, they should demand that service providers notify them of current and even potential outages.
"You don't want to be caught by surprise," says Methvin, who uses both S3 and Gmail. Some vendors have relied on passive notification approaches, such as their own blogs, he says, but they're becoming more proactive.

For example, some vendors are providing a status page where users can monitor problems or subscribe to RSS feeds or cell phone alerts that notify them when there's trouble. "If there's a problem, the cloud service should give you feedback as to what's wrong and how to fix it," Methvin says.

Users should also create contingency plans with outages in mind. At PC Pitstop, for instance, an S3 outage would mean users couldn't purchase products on its site, since it relies on cloud storage for downloads. That's why Methvin created a fallback option. If S3 goes down, products can be downloaded from the company's own servers.
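
A minimal sketch of that kind of fallback, assuming hypothetical URLs for the cloud copy and the in-house copy: try the primary source first and quietly fall back if it is unreachable.

    import urllib.request

    PRIMARY = "https://s3.example.com/downloads/product.exe"    # cloud-hosted copy (hypothetical URL)
    FALLBACK = "https://downloads.example.com/product.exe"      # company's own server (hypothetical URL)

    def fetch_installer(timeout=10):
        for url in (PRIMARY, FALLBACK):
            try:
                with urllib.request.urlopen(url, timeout=timeout) as resp:
                    return resp.read()
            except OSError:
                continue   # this source is down; try the next one
        raise RuntimeError("all download sources are unavailable")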

PC Pitstop doesn't have a backup plan for Google Apps, but Methvin reasons that with all of its resources, Google would be able to get a system such as e-mail up and running more quickly than his own staffers could if they had to manage a complex system like Microsoft Exchange. "You lose a little bit of control, but it's not necessarily the kind of control you want to have," he says.

Overall, it's important to understand your vendor's fail-over strategy and develop one for yourself. For instance, Palo Alto Software Inc. offers a cloud-based e-mail system that uses a caching strategy to enable continuous use during an outage. Called Email Center Pro, the system relies on S3 for primary storage, but it's designed so that if S3 goes down, users can still view locally cached copies of recent e-mails.
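
The same idea in miniature, sketched as a read-through cache: every successful read from the primary store refreshes a local copy, and reads fall back to that copy when the remote store is unreachable. The class and store interface here are invented for illustration, not Email Center Pro's actual design.

    class CachedMailStore:
        def __init__(self, remote_store):
            self.remote = remote_store   # e.g., an S3-backed store (stand-in object)
            self.local = {}              # message_id -> message body

        def get(self, message_id):
            try:
                body = self.remote.get(message_id)
                self.local[message_id] = body   # keep a local copy for outages
                return body
            except ConnectionError:
                # Remote store is down: serve the cached copy if we have one.
                return self.local.get(message_id)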

Forrester Research Inc. advises customers to ask whether the cloud service provider has geographically dispersed redundancy built into its architecture and how long it would take to get service running on backup. Others advise prospective users to discuss service-level agreements with vendors and arrange for outage compensation.

Many vendors reimburse customers for lost service. Amazon.com, for example, applies a 10% credit if S3 availability dips below 99.9% in a month.
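
Expressed as a small function, that credit policy looks like this; the only rule encoded is the one stated above.

    def outage_credit(monthly_bill, availability_percent):
        # 10 percent credit when monthly availability falls below 99.9 percent.
        return 0.10 * monthly_bill if availability_percent < 99.9 else 0.0

    print(outage_credit(1000.00, 99.85))   # -> 100.0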

Vendor Expertise

One of the biggest enticements of cloud computing is the promise of IT without the IT staff. However, veteran cloud users are adamant that this is not what you get. In fact, since many cloud vendors are new companies, their expertise -- especially with enterprise-level needs -- can be thin, says Rene Bonvanie, senior vice president at Serena Software Inc. It's essential to supplement providers' skills with those of your own in-house staff, he adds.

"The reality is that most of the companies operating these services are not nearly as experienced as we hoped they would be," Bonvanie says.

The inexperience shows up in application stability, especially when users need to integrate applications for functions like cross-application customer reporting, he says.

Serena itself provides a cloud-based application life-cycle management system, and it has decided to run most of its own business in the cloud as well. It uses a suite of office productivity applications from Google, a marketing automation application from MarketBright Inc. and an on-demand billing system from Aria Systems Inc.

So far, it has pushed its sales and marketing automation, payroll, intranet management, collaboration software and content management systems to the cloud. The only noncloud application is SAP, for which Serena outsourced management to an offshore firm.
According to Bonvanie, "the elimination of labor associated with cloud computing is greatly exaggerated."

The onus is still on the cloud consumer when it comes to integration. "Not only are you dealing with more moving parts, but they're not always as stable as you might think," he says.
"Today, there's no complete suite of SaaS applications, no equivalent of Oracle or R/3, and I don't think there ever will be," Bonvanie says. "Therefore, we in IT get a few more things pushed to us that are, quite honestly, not trivial."


Global Concerns
Cloud vendors today have a U.S.-centric view of providing services, and they need to adjust to the response-time needs of users around the world, says Reuven Cohen, founder and chief technologist at Enomaly Inc., a cloud infrastructure provider. This means ensuring that the application performs as well for users in, say, London as it does for those in Cincinnati.
Bonvanie agrees. Some cloud vendors "forget that we're more distributed than they are," he says.

For instance, San Bruno, Calif.-based MarketBright's cloud-based marketing application works great for Serena's marketing department in Redwood City, Calif., but performance diminished when personnel in Australia and India began using it. "People should investigate whether the vendor has optimized the application to work well around the world," Bonvanie says. "Don't just do an evaluation a few miles from where the hardware sits."

Worldwide optimization can be accomplished either by situating servers globally or by relying on a Web application acceleration service, also called a content delivery network, such as that of Akamai Technologies Inc. These systems work across the Internet to improve performance, scalability and cost efficiency for users.

Of course, situating servers globally can raise thorny geopolitical issues, Willis points out. Although it would be great to be able to load-balance application servers on demand in the Pacific Rim, Russia, China or Australia, the industry "isn't even close to that yet," he says. "We haven't even started that whole geopolitical discussion."

In fact, Cohen points out, some users outside of the U.S. are wary of hosting data on servers in this country. They cite the USA Patriot Act, which increases the ability of law enforcement agencies to search telephone and e-mail communications as well as medical and financial records, and eases restrictions on foreign-intelligence gathering within the U.S. The Canadian government, for instance, prohibits the export of certain personal data to the U.S.

"It's hazy and not well defined," Cohen says of the act. "People wonder, 'Can they just go in and take [the data] at a moment's notice, with no notification beforehand?' That's a whole second set of problems to be addressed."

Non-native Applications

Some applications offered on SaaS platforms were originally designed for SaaS; others were rebuilt to work that way. For example, Bonvanie says, there's a very big difference between applications like WebEx and Salesforce.com, which were designed as SaaS offerings, and Aria's billing platform, which was not.

"It's highly complex and fits in the cloud, but its origins are not cloud-based," he says. "If the offering was not born [in] but moved to the cloud, you deal with a different set of restrictions as far as how you can change it."

Whatever "cloud computing" is to you -- an annoying buzzphrase or a vehicle that might power your company into the future -- it's essential to get to know what it really means, how it fits into your computing architecture and what storms you may encounter en route to the cloud.

Mary Brandel is a Computerworld contributing writer in Newton, Mass. Contact her at marybrandel@verizon.net.

This version of the story originally appeared in Computerworld's print edition.

Reference : http://www.pcworld.com/article/153219/cloud_computing_gotcha.html?tk=rss_news

Windows Azure: Why Microsoft Better Get It Right

Microsoft unleashed a new cloud computing ecosystem at its recent Professional Developers Conference event, even though most observers chose to focus on more obvious, though less important, aspects of its announcement.

Essentially, Microsoft is going to create a Windows-based cloud infrastructure. Many of the details of its ultimate offering are still unclear, and it chose to discuss Azure primarily in terms of how it enables Microsoft-offered hosting (dubbed the Azure Services Platform) of Microsoft applications like Exchange, Live Services, .NET Services, SQL Server, SharePoint, and Dynamics CRM. I'm surprised that so many of the commentators (and presumably Microsoft itself) chose to discuss this offering in terms of Microsoft offering SaaS versions of its products. While I think this approach is interesting, it falls short of revolutionary; by contrast, the revolutionary aspects of Azure have barely been touched on in all the commentary about the new offering. Let me offer my take on what's really interesting about Azure:
Microsoft offering hosted versions (aka SaaS) of its applications is interesting; however, plenty of people already offer hosted versions of its server products (e.g., Exchange, SharePoint). So the mere fact of these apps being available in "the cloud" is nothing new. However, Microsoft has some interesting flexibility here. Other businesses offering hosted versions of these apps have to obtain licenses for them and pay some amount related to the list price of the product. By contrast, Microsoft, as the producer of the products, can, should it choose, price its hosted versions nearer the marginal license cost of an instance, i.e., near zero. Of course, Microsoft still has to pay for the infrastructure, operations, etc., but it can clearly obtain a price advantage compared to competitive offerings. This leads us to the next point: infrastructure pricing.

Microsoft, based on the cash flow from its packaged software offerings, clearly has a capital cost advantage compared to its competitors for hosting Microsoft applications. And, based on its experience running Hotmail, Microsoft clearly has the operational experience to scale an infrastructure cost-effectively. Added to its ability to price software licenses at the margin, this positions Microsoft to be the low-cost provider of Microsoft application hosting. And that advantage doesn't even include the (dare I say it) synergies available to it from its common ownership of the cloud offering and the applications themselves.

However, focusing on these aspects of the offering is missing the forest for the trees, so to speak; perhaps a better way to say it is that it focuses on the low-hanging fruit without noticing there is much more, and sweeter, fruit just beyond it. And that's where Azure gets interesting.

First and foremost, Azure offers a way for Microsoft-based applications to be deployed in the cloud. All of the cloud offerings thus far have been Linux-oriented and required Linux-oriented skills. This has been fine for the first generation of cloud developers: they're early adopters most likely to have advanced skills. There is an enormous base of mainstream developers with Windows-based skills, though; corporations are stuffed with Windows developers. Before Azure, these developers were blocked from developing cloud-based applications. With Azure, they can participate in the cloud, which is why other elements of the announcement relating to .NET and SQL Server are so important. These capabilities of Azure will accelerate cloud adoption by enterprises. So Azure's support of the Windows development infrastructure is a big deal.



Cloud Economics Meets Windows Infrastructure
But even that isn't as important as what else Azure will provide: the mix of cloud economics and innovation with Windows infrastructure. One can argue that other providers (e.g., Amazon Web Services) could offer the same ability to host Windows infrastructure, so at first blush Azure might not seem so important. However, those offerings would face the issue alluded to earlier, namely Microsoft's competitive advantage through marginal license pricing (which might, of course, attract antitrust scrutiny, so that advantage could prove moot). More importantly, Microsoft has an advantage no one else can match: it can extend its components' architectures, at least theoretically, to be better suited to cloud infrastructure. That is to say, Microsoft could take .NET, which today is primarily focused on operating on a single server, and extend it to transparently operate on a farm of servers, scaling up and down depending on load. And this is where Azure could get truly revolutionary. Marrying today's widely distributed Microsoft skill base with a cloud-capable architecture built on established Microsoft component development approaches, APIs and so on could unleash a wave of innovation at least as great as the innovation already seen on EC2 (see this previous blog posting for some insight into today's cloud innovation). In fact, given the relative distribution of Linux and Windows skills, one could expect the Azure wave to be even larger.

Of course, this is all future tense, and by no means certain. Microsoft has, in the past, announced many, many initiatives that ultimately fizzled out. More challenging, perhaps, is how a company with large, established revenue streams will nurture a new offering that might clash with those established streams. This is Clayton Christensen territory. It can be all-too-tempting to skew a new offering to "better integrate" with current successful products to the detriment of the newcomer.

Microsoft has a mixed track record in this regard. I won't make a prediction about how it will turn out, but it will be a real challenge. However, cloud computing is, to my mind, at least, too important to fail at.

Cloud computing is at least as important as the move to distributed processing. If you track what distributed processing has meant to business and society (a computer on every desk and in every home, and so on), you begin to get an appreciation for why Microsoft has to successfully address the cloud. Azure is a bet-the-company initiative, and there's a reason they're called bet-the-company: they're too important to fail at. So Microsoft needs to get Azure right.
Bernard Golden is CEO of consulting firm HyperStratus, which specializes in virtualization, cloud computing and related issues. He is also the author of "Virtualization for Dummies," the best-selling book on virtualization to date.


Reference : http://www.pcworld.com/article/153216/ms_windows_azure.html?tk=rss_news

Back up and Restore Boot Camp Partitions

Reader Dave Bradley is trading up, but would like to take his Boot Camp partition along for the ride. He writes:
I'm planning to replace the hard drive in my MacBook Pro with a higher-capacity drive. On that MacBook Pro I have both a partition for my Mac stuff and a Boot Camp partition that has Windows on it. I'm going to use Carbon Copy Cloner to clone the Mac partition to an external drive I have and then restore it to the new drive, but how do I make a copy of the Boot Camp partition?
I'll begin by saying that it's possible. I'll follow that by suggesting that unless you've spent days configuring Windows you might be better off with a fresh install of Boot Camp and Windows. I don't think I'm telling secrets out of school in saying that it takes Windows very little time to get completely junked up. Sometimes starting over is the best course.
But if that means hours and hours of additional work, then cloning and restoring may be your preference. To do that, grab a copy of Two Canoes Software's free Winclone. I used it last week to perform an operation similar to the one you're about to undertake and it worked beautifully.
Just launch Winclone, choose your Boot Camp partition from the Source menu, and click the Image button. As the button's name suggests, this creates an image of that partition and saves it on the Mac side of the drive. Now clone the Mac partition and then swap the drives. Once you've swapped the drives and restored the Mac side, launch Winclone on the new drive, click the Restore tab, drag the Boot Camp image into Winclone's Restore Image field, and click Restore. Winclone will create a new Boot Camp partition on your drive and restore its contents from the image you created earlier.
Note that thanks to Microsoft's Draconian Windows activation scheme it's highly likely that you'll have to activate Windows again. When I did this, online activation was a bust as Microsoft believed that I was trying to exceed my activation limit (because Windows was tied to my old hard drive). Go immediately to phone activation, as telling the nice automated operator that you've installed Windows on only one computer seems to satisfy her to the point that she's willing to cough up the seemingly endless string of numbers that allow you to activate Windows.


Reference : http://www.pcworld.com/article/153238/.html?tk=rss_news

TechEd: New networking features revealed for Windows 7: DirectAccess and BranchCache

Microsoft is clearly using Tech Ed IT Pro in Barcelona to start revealing details about new features in the upcoming releases of Windows 7 and Windows Server 2008.

One of the first sessions after the opening keynote was about new networking features in Windows 7 and Windows Server 2008. The enhancements clearly focus on making Windows work better from outside the corporate network and in branch office situations.

Two prominent new features are:
DirectAccess
BranchCache

DirectAccess is Microsoft's implementation of what I have earlier referred to as DirectConnect. The technology enables users to access resources on the corporate network from a corporate system over the Internet, without a VPN, using IPv6 and IPSec. Steve Riley already did a demo of this technology at Tech Ed in Orlando. It will be implemented as a client in Windows 7 and as a role on Windows Server 2008 R2.

BranchCache is a new caching technology that locally caches data retrieved over a WAN link using SMB or HTTP(S). This enables the next user who needs the same piece of information to retrieve the data without pulling it all over the line again. The demos showed a substantial gain in speed. BranchCache is implemented in such a way that it preserves the security of the information.

BranchCache can be implemented in two modes:
BranchCache Distributed Cache
BranchCache Hosted Cache

BranchCache Distributed Cache provides a peer-to-peer mechanism for caching the data at the branch office. In this mode, Windows 7 systems request a local copy of the data from each other before pulling the data over the WAN link. Each time the data is fetched, the client checks with the originating server that the data has not changed and that the security settings allow access. BranchCache Distributed Cache only works for clients at the branch office that are on the same subnet.

With BranchCache Hosted Cache, a Windows Server 2008 R2 server at the branch office is assigned to hold a cached copy of data that is retrieved over the WAN link. This server is configured with the BranchCache role and designated as the caching server through Group Policy. It acts as the caching server for all Windows 7 clients at the branch office, regardless of their subnet.
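
The retrieval flow common to both modes can be sketched roughly as follows. This is a conceptual illustration only, not Microsoft's implementation; the cache, origin server and client objects are stand-ins.

    def retrieve(content_id, branch_cache, origin_server, client):
        # Ask the origin for current metadata; this also enforces access control.
        meta = origin_server.get_metadata(content_id, client.credentials)

        cached = branch_cache.lookup(content_id)       # a peer cache or the hosted cache server
        if cached is not None and cached.version == meta.version:
            return cached.data                         # served locally, nothing pulled over the WAN

        data = origin_server.fetch(content_id, client.credentials)   # full WAN transfer
        branch_cache.store(content_id, data, meta.version)           # cached for the next client
        return data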

The only downside I see in BranchCache is the fact that you need Windows Server 2008 R2/Windows 7 on both sides of the connection.


Reference : http://www.xpworld.com/

Microsoft: Data Shows Vista More Secure Than XP

Microsoft's latest security report shows that the number of new vulnerabilities found in its software was lower in the first half of the year than in the last half of 2007, with the Windows Vista OS proving more resistant to exploits than XP.

Microsoft reported 77 vulnerabilities from January to June compared to 116 for the last six months of 2007, according to the company's fifth Security Intelligence Report.

The decline is in line with the software industry as a whole, which saw a 19 percent decrease in vulnerability disclosures compared to the first half of 2007, Microsoft said. However, those vulnerabilities considered highly severe rose 13 percent.

Exploit code was available for about a third of the 77 vulnerabilities; however, reliable exploit code is available for only eight of those 77.

Other data shows that XP is attacked more frequently than Vista. On XP machines, Microsoft's own software contained 42 percent of the vulnerabilities attacked, while 58 percent were in third-party software. On Vista machines, Microsoft's software had 6 percent of the vulnerabilities attacked, with third-party software containing 94 percent of the flaws.

New security technologies such as address space randomization have led to fewer successful attacks against Vista, said Vinny Gullotto, general manager of Microsoft's malware protection center.

"Moving onto Vista is clearly a safe bet," Gullotto said. "For us, it's a clear indicator that attacking Vista or trying to exploit Vista specifically is becoming much more difficult."

The highest number of exploits were released for Windows 2000 and Windows Server 2003 operating systems, Microsoft said.

Hackers appear to be increasingly targeting Internet surfers who speak Chinese. Microsoft found that 47 percent of browser-based exploits were executed against systems with Chinese set as the system language.

The most popular browser-based exploit is for the MDAC (Microsoft Data Access Components) bug that was patched (MS06-014) by Microsoft in April 2006. Some 12.1 percent of all exploits encountered on the Internet targeted that flaw. The second most encountered exploit is one aimed at a vulnerability in the RealPlayer multimedia software, CVE-2007-5601.

The two most commonly exploited vulnerabilities in Windows Vista concerned ActiveX controls that are commonly installed in China, Microsoft said.

Gullotto said Microsoft is continuing to improve the Malicious Software Removal Tool (MSRT), a free but very basic security application that can remove some of the most common malware families.

Last month, Microsoft added detection for "Antivirus XP," one of several questionable programs that warn users their PC is infected with malware, Gullotto said. The program badgers users to buy the software, which is of questionable utility. "Antivirus XP" is also very difficult to remove.

Microsoft fielded some 1,000 calls a month about Antivirus XP on its PC Safety line, where users can call and ask security questions. Since the MSRT started automatically removing the program, calls concerning Antivirus XP dropped by half the first week, Gullotto said.

Reference : http://www.pcworld.com/article/153193/windows_security.html?tk=rss_news

Monday, 3 November 2008

Enable Windows 7 PreBeta Build 6801 protected bits!


During PDC ‘08, I was passed a note indicating that I should dig deeper into the bits to discover the snazzy new Taskbar. Upon cursory analysis, I found no evidence of such and dismissed the idea as completely bogus.


I got home and started doing some research on a potentially new feature called Aero Shake when I stumbled upon an elaborate set of checks tied to various shell-related components, including the new Taskbar.


Update: Although a newer-looking Taskbar is present, it’s not exactly what you saw at PDC ‘08. For example, the Quicklaunch toolbar still exists, Aero Peek doesn’t work properly, and Jumplists are stale. This is likely why it wasn’t enabled out of the box, so set your expectations accordingly.


To use these, what I call “protected features”, you must meet the following criteria:



  1. Must be a member of an allowed domain

    • wingroup.windeploy.ntdev.microsoft.com

    • ntdev.corp.microsoft.com

    • redmond.corp.microsoft.com



  2. Must not be an employee with a disallowed username prefix

    • a- (temporary employees)

    • v- (contractors/vendors)





Protected Feature Flowchart
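
A rough reconstruction of that check in Python, for readers who prefer code to flowcharts. The function name and caching variable are my own; Explorer performs the equivalent checks natively and, as described below, caches the result for its lifetime.

    ALLOWED_DOMAINS = {
        "wingroup.windeploy.ntdev.microsoft.com",
        "ntdev.corp.microsoft.com",
        "redmond.corp.microsoft.com",
    }
    DISALLOWED_PREFIXES = ("a-", "v-")   # temporary employees, contractors/vendors

    _cached_result = None   # mirrors the per-feature cached flag in Explorer's .data section

    def protected_feature_enabled(machine_domain, username):
        global _cached_result
        if _cached_result is None:   # evaluated once, then cached
            _cached_result = (machine_domain.lower() in ALLOWED_DOMAINS
                              and not username.lower().startswith(DISALLOWED_PREFIXES))
        return _cached_result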



As checking against these criteria is potentially expensive in terms of CPU cycles, the result of the check is cached for the duration of Explorer’s lifetime (per protected feature). The cached value is stored in a variable, space for which is allocated in the image’s initialized data section (.data).


Explorer does not initialize these variables at start and checks for a cached result before performing any checks. I exploited this behavior by setting the initialized value in the image itself to 1 instead of 0, bypassing all twelve checks.
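
The patch itself boils down to flipping one initialized byte in the file on disk. A minimal sketch of that idea, with a placeholder offset (finding the real offset of each cached flag is exactly what the tool described below automates):

    FLAG_OFFSET = 0x0   # placeholder only; NOT a real offset into explorer.exe

    def patch_cached_flag(path, offset=FLAG_OFFSET):
        with open(path, "r+b") as f:
            f.seek(offset)
            f.write(b"\x01")   # 1 instead of 0, so the check is treated as already passed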


Why not use a hook to intercept GetComputerNameExW / GetUserNameW?


I thought about building a hook to inject into the Explorer process upon start, but I grew concerned that legitimate code in Explorer that uses those functions to perform various legitimate tasks would malfunction. And I was lazy.


Can I has too? Plz?


Simply download a copy of a tool I whipped up for either x86 or x64 (untested thus far), drop it into your Windows\ directory and execute the following commands as an Administrator in a command prompt window:



  • takeown /f %windir%\explorer.exe

  • cacls %windir%\explorer.exe /E /G MyUserName:F (replacing MyUserName with your username)

  • taskkill /im explorer.exe /f

  • cd %windir%

  • start unlockProtectedFeatures.exe


After changing the protected feature lock state, you can re-launch the shell by clicking the Launch button.



Screenshot of PDC ‘08 build with new Taskbar


Why did Microsoft do this?


I’m not sure why these features went into the main (winmain) builds wrapped with such protection. What are your thoughts?


Saturday, 1 November 2008

Microsoft Research Demonstrates Technology Breakthroughs at PDC2008

Developers get new tools and a glimpse into future of robotics, Surface, other Microsoft innovations designed to address societal issues and change the computing experience.


Microsoft Research Web site

Front view of Microsoft Surface.

Los Angeles – Oct. 29, 2008 – At Microsoft’s Professional Developers Conference 2008, Rick Rashid, senior vice president of Microsoft Research, today showed developers how Microsoft is applying software’s power to tough technological and societal challenges.

Rashid announced the limited release of the first software development toolkit (SDK) for Microsoft Surface, new features for Worldwide Telescope, and the Microsoft CCR and DSS Toolkit 2008, which will make it easier to develop loosely coupled concurrent and distributed applications.

“Advances in software hold the key to progress in multiple fields,” said Rashid. “The modern world generates massive data sets - online search, astronomical phenomena, the climate, particle physics, and the human genome, to name a few areas. With software, we can capture, analyze, and make sense of this data to help combat global warming, develop life-saving vaccines, and enrich our kids’ education.”

In his keynote address, Rashid highlighted the news announcements plus initiatives spearheaded by more than 800 researchers in Microsoft Research’s six global labs that are aimed at easing societal problems and changing the computing experience:

Microsoft Surface SDK

Microsoft Surface is a computing platform that opens a new chapter in the way people interact with computers by connecting them to digital content through natural gestures, touch, and devices such as wireless phones or even tagged drink glasses. The platform is being opened up to the developer community for the first time at PDC2008 with the limited release of the Microsoft Surface software development kit (SDK). The SDK enables developers to build groundbreaking applications that take advantage of the attributes of Microsoft Surface, which include:

Direct interaction: the ability to execute commands through gesture or touch, rather than via a mouse or keyboard.

Multi-touch: the ability to manipulate multiple on-screen items at once. Surface can read more than 52 individual touches.

Multi-user: new collaborative computing scenarios made possible by Surface’s horizontal form factor.

Object recognition: digital responses to objects placed on Surface – functionality that will ultimately permit the transfer of digital content.

Additional highlights from Rashid’s keynote included a wide range of updates and announcements:

WorldWide Telescope

Developed by Microsoft Research, WorldWide Telescope is a “Web browser for the sky”, bringing together images from the best ground- and space-based telescopes so people can explore the cosmos from their PC screen. Since its launch in May 2008, more than one million people have downloaded the web application. Several new features are now available for WorldWide Telescope, including a 3-D Solar System, more than 1,000 new images, and a tool that allows people to upload and share their own images of space. Existing users will be prompted to download the new features the next time they open the program. Others can download WorldWide Telescope at http://www.worldwidetelescope.org/

The Microsoft CCR and DSS Toolkit 2008

The Microsoft CCR and DSS Toolkit 2008 delivers a set of .NET- and Compact Framework-class libraries and tools that enable developers to better deal with the inherent complexities of creating loosely coupled concurrent and distributed applications. The Toolkit is designed to help developers take advantage of the Concurrency and Coordination Runtime (CCR) and Decentralized Software Services (DSS) originally released as part of Microsoft Robotics Developer Studio. The Microsoft CCR and DSS Toolkit 2008 provides early adopters with access to select technologies today, transitioning to Microsoft’s .NET Framework in the future. To learn more about Microsoft CCR and DSS Toolkit 2008, visit: http://www.microsoft.com/ccrdss

Related Links

CCR DSS Tyco Case Study (video)

CCR DSS Siemens Case Study (written)

CCR DSS Siemens Case Study (video)

Tiny devices, big impact

Scarce energy resources and worries about climate change create challenges and opportunities for computing. The advent of large datacenters that underpin cloud-based computing services makes energy-efficient computing increasingly important. Using technology developed by Microsoft Research, Microsoft is deploying tiny sensors throughout its datacenters to capture data that will allow it to better regulate energy consumption and reduce their carbon footprint. Sensors can also be deployed in the wild to help scientists monitor and track environmental changes. Rashid demonstrated the sensor technology in the auditorium at PDC.

Related Links:

SenseWeb Project

Boku: turning programming into play

Boku is a fun, intellectually stimulating game, developed by Microsoft Research, that introduces youngsters to programming while they play. Through programming Boku, a virtual robot, children learn the basic principles of programming logic, analysis, and design. The 3-D interactive game is designed to demystify programming and spark interest in a career in science. Youngsters as young as nine years old have already used Boku in trials to create their own games.

Related links:

Boku Web Site

Boku Gameplay Montage (Windows Media)

Boku Programming Walkthrough (Windows Media)

DryadLINQ

Developed by a team from Microsoft Research, DryadLINQ is a powerful programming environment that enables ordinary programmers to write large-scale data parallel applications to run on large PC clusters. The platform comprises Dryad, a distributed execution engine that allows reliable, distributed computing across thousands of servers for large-scale data parallel applications, and the .NET Language Integrated Query, or LINQ, which allows developers to write and debug applications in a SQL-like query language, using the entire .NET library and Microsoft Visual Studio.

Related link:

DryadLINQ Web site

SecondLight: a magic lens that goes beyond the surface

The brainchild of Microsoft researchers, SecondLight is a rear-projection technology that extends and enriches the Microsoft Surface device through the ability to project images both through and beyond the surface display, such as onto a translucent piece of plastic. With SecondLight, the translucent piece of plastic can also function as a “magic lens.” For instance, if it is passed over an image displayed on the primary surface, such as a car, it provides a view of the “inner workings” behind the image. In another application of this so-called “layering effect,” the transparency could register images of constellations when passed over a surface displaying the night sky. The technology also permits gesture-based interactions with the surface from farther away than rear-projected systems allow.

Related Links:

SecondLight Video

White Paper: Going Beyond the Display: A Surface Technology with an Electronically Switchable Diffuser

New Developer Tools and Investments Span from Traditional Application Models to Cloud Development

Microsoft focuses on simplifying core tasks such as data and identity management, allowing developers to spend more time creating innovative applications and user experiences.


LOS ANGELES — Oct. 29, 2008 — This week at its Professional Developers Conference (PDC2008), Microsoft highlighted the company’s strategic shift to Software-plus-Services, announced a new cloud services platform for developers, and offered on-site developers Community Technology Previews (CTPs) for dozens of innovations across its traditional, on-premises application platform.

Microsoft’s announcements at, and leading up to, PDC underscore the company’s strategic focus on making it easier for developers to build, deploy and manage applications across a broad range of scenarios, while using their existing skills and tools.

“Microsoft is committed to delivering tools for developers of all skill sets, while offering a consistent, integrated experience for software development teams, and providing a secure, reliable solution for developing high quality applications for the latest platforms,” says Dave Mendlen, director of Developer Marketing at Microsoft.

Building a Foundation for Innovation

In September, Microsoft outlined its vision for Visual Studio 2010 and the .NET Framework 4.0 by describing five focus areas: Democratizing ALM (Application Lifecycle Management), Breakthrough Departmental Applications, Inspiring Developer Delight, Riding the Next Generation Platform Wave and Enabling Emerging Trends.

This week at PDC, the dialogue around those focus areas continues, with details on ways to make it easy for developers to use their existing Visual Studio and .NET skills to develop for “the cloud” – software that resides primarily on the web but spans the server, PC and mobile devices as well.

Microsoft also provided a deeper look at the next generation platform opportunities by announcing that Visual Studio 2010 will be optimized to help developers build Windows 7 and Windows Server 2008 R2 applications and take advantage of new Web development features.

These highlights build on a series of recent announcements about new technologies designed to help developers create new software experiences, including:

New enhancements to the Windows Server application capabilities, including Windows Communication Foundation 4.0, Windows Workflow Foundation 4.0, and “Dublin” extensions to Windows Server.

Visual Studio 2010 programming models for concisely expressing “concurrency,” or the ability for applications to efficiently run multiple instructions on a “multicore” or “manycore” processing chip, including new .NET Framework libraries such as the Task Parallel Library and Parallel LINQ, as well as the Parallel Pattern Library and Concurrency Runtime for developing native applications with C++ (a rough illustration of this data-parallel style follows this list).

Ongoing investments in the .NET Framework through the addition of new functionality in Windows Communication Foundation (WCF) and Windows Workflow Foundation (WF). For cloud computing and services, the tools are available today as separate add-ons for Microsoft Visual Studio 2008 SP1 (Standard Edition or better), and for Microsoft Visual Web Developer 2008 Express Edition SP1. Support for Microsoft Visual Studio 2010 will be available in the near future.
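
As a loose stand-in for the data-parallel style those libraries encourage (this is plain Python, not the Task Parallel Library, Parallel LINQ or the C++ PPL), the same shape of computation looks like this: one work item applied across a collection, with the runtime spreading the items over available cores.

    from concurrent.futures import ProcessPoolExecutor

    def work_item(n):
        return sum(i * i for i in range(n))   # stand-in for a CPU-bound task

    if __name__ == "__main__":
        inputs = [200_000, 400_000, 600_000, 800_000]
        with ProcessPoolExecutor() as pool:        # defaults to one worker per core
            results = list(pool.map(work_item, inputs))
        print(results)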

According to Mendlen, the strategy is based on a high-level perspective of the development ecosystem, and the realization that the ultimate goal for every developer is to build engaging, practical new applications for end users.

“By working to make the process of developing simpler and more efficient, we’re empowering developers to focus on using the power of technology to address the needs of today’s marketplace,” he says.

New Platform and Modeling Tools for Declarative Development

Today software development is increasingly characterized by describing processes and links in applications, and less by writing lines and lines of new code. This so-called “model” approach has the potential to save developers untold hours in creating new applications.

“In order to make model-driven development a reality, Microsoft is focused on providing a platform and visual modeling tools that make it easy for all ‘mainstream’ users, including information workers, developers, database architects, software architects, business analysts and IT professionals, to collaborate throughout the application development lifecycle,” says Steven Martin, senior director of Microsoft’s Connected Systems Division.

According to Martin, since modeling is also a key bridge between designing and deploying on-premise and cloud-based applications or components, Microsoft delivered the community technology preview for its new modeling platform, dubbed “Oslo,” this week. It is made up of three technologies: a language, a relational repository and a new developer tool, code-named “Quadrant.”

“Oslo will enable developers to define solutions across cloud and on-premise environments, built on a consistent modeling experience,” Martin says. “Ultimately we’re aiming to reduce the complexity inherent in building large scale distributed applications.”

A member of the Visual Studio family, the “Quadrant” tool is designed to help define and interact with programming models in a rich and visual manner, including support for the “M” programming language that allows developers to create and use textual domain-specific languages and data models.

Microsoft announced its commitment to publish the “M” language specification under the Open Specification Promise (OSP), which makes it possible for third parties, including open source projects, to build implementations of "M" for other runtimes, services, applications and operating systems.

According to Martin, by putting model-driven innovation directly into the .NET platform, Microsoft is working to help organizations gain visibility and control over applications, ensure they are building systems based on the right requirements, simplify the stages of development and re-use, and enable developers to resolve potential issues at a high level before the company starts committing resources.

“Modeling has been heralded as a means to break down technology and role ‘silos’ in application development, and to assist IT departments in delivering more effective business strategies,” says Martin. “Microsoft believes that modeling is a way to tackle the complexity of distributed application development by helping developers and IT pros to create applications from models, and then execute the models in different runtimes as needed. Our model-driven investments with Oslo will help realize the vision of Dynamic IT by connecting runtimes across Microsoft products such as Windows Server, .NET Framework, Visual Studio, BizTalk Server, SharePoint, and System Center.”

Making it Simpler to Manage Data Across Multiple Sources

In building today’s business applications, developers need to access, manage and synchronize multiple sources of data. With Visual Studio 2008, developers can rapidly take advantage of offline synchronization capabilities to sync-enable applications and services easily, with rich support for designers.

Beginning this week, developers can download the community technology preview of the Sync Framework v2. By embedding the new Sync Framework into their applications, developers can now easily enable any type of data to follow users and customers wherever they go.

“The Microsoft Sync Framework extends the support featured in Visual Studio 2008 to also include offline and peer-to-peer collaboration using any protocol for any data type, and any data store,” says Mark Linton, director of Application Platform Marketing for Microsoft. “This is part of Microsoft’s long-term commitment to providing synchronization for partners and ISVs (Independent Software Vendors).”

In addition, the ADO.NET Entity Framework will enable developers to develop rich applications on top of any data source including SQL Server, IBM DB2, Oracle and many others, meeting developers’ needs for a provider-agnostic approach to data.

“Working with data is a big challenge for a lot of applications and services,” Linton says. “These technologies can help overcome that hurdle, so developers can build new ways to use the data itself and improve the end user experience.”

Along with greater access to and synchronization of data, another new technology announced this week allows information to move faster along the pipeline — Microsoft released the next community technology preview of a new application cache platform code-named “Velocity,” designed to make it easy for organizations to accelerate the development of new applications.

“Velocity increases performance of applications by moving the data out of the data store and closer to the application in the middle tier, significantly reducing the number of trips made to and from the data store to retrieve it,” says Linton. “Velocity provides developers with a simple and integrated set of APIs, allowing them to easily cache data in the application memory, along with the session state or any other CLR Object.”
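
The pattern Linton describes is essentially cache-aside: check an in-memory cache first and go to the data store only on a miss. A small sketch with an invented API, not Velocity's actual interface:

    class AppCache:
        def __init__(self, data_store):
            self.store = data_store   # stand-in for the back-end data store
            self.cache = {}           # in-memory cache in the middle tier

        def get(self, key):
            if key in self.cache:
                return self.cache[key]       # served without a trip to the data store
            value = self.store.load(key)     # cache miss: one trip to the back end
            self.cache[key] = value
            return value

        def put(self, key, value):
            self.store.save(key, value)
            self.cache[key] = value          # keep the cached copy current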

Project “Velocity” is also integrated with ASP.NET and the .NET Framework, and will be available for no cost in summer of 2009.

A New Model to Simplify Identity Management

Another time-consuming task for developers today, according to Linton, is managing user identities, which is becoming more important with the growth of cloud-based computing and its associated need for new ways to access software and services from any location or device.

“Working with user identities in applications is hard for developers today, because they must choose from many different identity technologies, and hard-code custom user access logic into every application,” says Linton. “This takes time away from core development work.”

To address this issue, Microsoft announced a new identity management strategy at PDC today, starting with a single, simplified identity model, code-named “Geneva,” that works for on-premises and cloud-based applications in the enterprise, in federated networks, and on the consumer Web, using interoperable standards that can interface with a variety of technologies including WS-* and SAML. Geneva consists of three components:

Geneva Framework, which helps developers build claims-aware applications and services that externalize user authentication from the application

Geneva Server, a security token service (STS) that issues and transforms claims, manages user access, and enables automated federation

Windows CardSpace Geneva, which helps users navigate access decisions between multiple identities and control how personal information is used

Also announced as part of Microsoft’s cloud strategy are Microsoft Services Connector and .NET Access Control Service, which are both built on “Geneva” technology and share the same claims architecture.

Preparing for Windows 7

For Visual Studio 2010, the company has also invested heavily in C++ to ease development of native Windows applications.

“We are adding tools to assist developers in building new Windows 7 and Windows Server 2008 R2 applications and enabling existing native applications to take advantage of new Windows features,” Linton says.

According to Linton, the support includes full library and header support for Windows 7, significant updates to MFC to support Windows 7 and Windows Server 2008 R2 UI elements like the ribbon, live icons, search access, and even support for multi-touch enabled interfaces.

“With Windows 7, existing applications will have access to all the new features offered by Microsoft’s latest operating systems,” Linton says. “This gives developers even more opportunities to build engaging features into their applications.”



Reference : http://www.microsoft.com/presspass/features/2008/oct08/10-29SSTools.mspx?rss_fdn=Top%20Stories

Windows 7 to scale to 256 processors

Microsoft has been hinting that even though it had no plans to make major changes to the Windows kernel, it did have a scheme up its sleeve to make Windows 7 and Windows 7 Server better suited to working on multicore/parallel systems. Now details are becoming clearer as to how Microsoft plans to do this.
During the debut of the pre-beta of Windows 7 this week, Windows Engineering Chief Steven Sinofsky made a passing reference to Windows 7 being able to scale to 256 processors. But he never said how this would be enabled.
Mark Russinovich, Technical Fellow in Microsoft’s Core OS division, explained in more detail how Microsoft has managed to do this in a video interview published on Microsoft’s Channel 9 Web site.
Russinovich said that Microsoft has managed to break the dispatcher lock in Windows — a task that had stumped even the father of the Windows NT operating system, David Cutler. When Cutler designed Windows for the server, systems beyond 32-way seemed far, far away, Russinovich said.
On more massively multiprocessor systems, Windows threads spin while waiting for the dispatcher lock. Once Cutler had been moved to work on Microsoft Red Dog (Windows Azure), another kernel developer, Arun Kishan, looked at this problem with a set of fresh eyes and found a solution, Russinovich said. By adding another state — so threads aren’t just running or waiting, but can be “pre-waiting,” as well — Windows will be better suited to running parallel, multithreaded applications running across manycore systems, Russinovich said.
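
One way to picture the idea, strictly as a conceptual illustration and not the NT kernel's actual code: give threads an intermediate pre-waiting state so the bookkeeping for a wait can happen under a per-object lock instead of one global dispatcher lock.

    from enum import Enum, auto
    import threading

    class ThreadState(Enum):
        RUNNING = auto()
        PRE_WAITING = auto()   # the transitional state described above
        WAITING = auto()

    class KThread:
        def __init__(self):
            self.state = ThreadState.RUNNING

    class WaitableObject:
        def __init__(self):
            self._lock = threading.Lock()   # a lock per waitable object, not one global lock
            self.waiters = []

        def add_waiter(self, thread):
            thread.state = ThreadState.PRE_WAITING   # no lock held yet
            with self._lock:                         # only this object's lock is taken
                self.waiters.append(thread)
                thread.state = ThreadState.WAITING
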
Russinovich noted with the dispatcher-lock roadblock removed, a second set of locks became the new focus for folks working on the Windows kernel. The PFN database inside Windows, which contains information on all of the physical memory in the system, was becoming another scalability bottleneck when trying to get Windows to handle multithreaded apps on massively multicore machines. With Windows 7 and Windows Server 2008 R2 (Windows 7 Server), Microsoft again broke this lock down into finer grain locks, Russinovich said.
I’d expect Microsoft will delve into the ways it is making the next generation of Windows more multiprocessing-capable at the Windows Hardware Engineering Conference (WinHEC) next week in Los Angeles. Stay tuned.
In the meantime, given I’m not a programmer and am trying to channel a very technical Russinovich, it’s probably worth checking out the Channel 9 video interview of him yourself if you care about Windows kernel futures.

Reference : http://blogs.zdnet.com/microsoft/?p=1687

Ozzie Points to Slimmer Future for Windows Client

Microsoft is putting the Windows client OS on a diet as a way to bring the PC OS into the age of cloud computing.

Windows 7, Vista's follow-up, already will be a thinner, more streamlined OS, replacing some of the software Microsoft previously included with the OS with Web-based Windows Live Services. And if comments made by Chief Software Architect Ray Ozzie at Microsoft's Professional Developers Conference (PDC) this week are any indication, Windows will slim down even further in the future, returning to the original intent of an OS -- a way to optimize the hardware it runs on -- instead of being a bloated piece of software whose performance and value rely on compatibility with installed applications.

"The purpose of the OS on the device is to have the best value on that device," Ozzie said at PDC in an interview with the IDG News Service, adding that there is still "tremendous opportunity for innovation" for using the OS to leverage device hardware.

He said that in the future Windows will have "base connections to the Internet" so people can connect to the Web through a browser and services like Windows Update.

But Microsoft won't rely too heavily on the Internet to achieve its goal to support innovative hardware features -- such as touchscreen capability -- so people in places without reliable connections to the Web can still reap the benefits of the OS, he said.

This slimming down of the client OS is as much a way for Microsoft to keep Windows relevant as a hardware OS as it is for the company to concede to the new cloud-computing and services paradigm that Google, Amazon and other companies are pioneering.

Vista might have been a good place to start this evolution, but Microsoft missed the opportunity, said Brian Madden, an independent technology analyst in San Francisco.

He said Vista "would have been great" if it had come out in the late 1990s or even the early 2000s, the height of the trend toward client-side applications on PCs -- a model that is rapidly becoming obsolete as hosted services evolve.

"Vista to me is the culmination of the old way of thinking as the desktop should be," he said, and the fact that it came out in 2007, as the industry was shifting from packaged software to Web-based applications, was "a huge disaster."

Madden called the company's plan to make Windows lighter and more nimble a "reluctant" one. "Microsoft is not leading the way down this path, they're being dragged kicking and screaming by companies like Google," he said.

Andrew Brust, chief of new technology at consulting firm Twentysix New York, a Microsoft technology partner, has a different take on Microsoft's planned evolution for Windows. He said that the company is trying to re-emphasize the value of having a strong client powered by Windows in combination with the opportunity Web-based applications provide, rather than giving customers a choice between one or the other. Microsoft calls this its "software-plus-services" strategy.

"Unlike Google, which is trying to take AJAX/browser apps and make them look like they're running on the desktop, Ozzie is making the point that the combined value of the Windows OS and assets on the Web -- including, but not limited to, [Windows] Azure and Windows Live -- is Microsoft's play, and a winning play at that," he said.

AJAX, or Asynchronous JavaScript and XML, is a Web development technique for creating interactive Web applications. Windows Azure, which Microsoft revealed at PDC, is its cloud-based application development environment, which competes with Amazon's Elastic Compute Cloud.

Still, there is no denying Microsoft knows Windows must change as the industry moves away from running software on the client to using Web-based applications. The company's decision not to include Windows Photo Gallery, Windows Mail and Windows Movie Maker as part of Windows 7 in favor of Web-based versions of those applications is part of this trend.

Microsoft is even planning to release a hosted version of the Office productivity suite, which is Microsoft's top software seller next to Windows -- another acknowledgement of the move to hosted services, as well as a nod to competition from Google's Web-based productivity suite, Google Docs. Google Docs is beginning to gain some traction not only with consumers but also with enterprises.

Microsoft plans to release a lightweight hosted version of Office called Web Applications for Office around the same time it releases the next version of the productivity software code-named Office 14, Microsoft revealed at PDC.

Decisions to offer hosted Office and an overall thinner Windows client OS also are in line with the move to offer Windows on low-cost PCs in emerging markets that Microsoft is keen to reach. These PCs have less memory and CPU power, so they can't support an OS with a footprint as big as the premium version of Windows Vista.


Reference : http://www.pcworld.com/article/153100/.html?tk=rss_news

Vista Fights for Relevancy Against Poor Sales, XP, Windows 7

Recent sales numbers for Windows Vista paint a somewhat dreary picture for the OS as consumers and enterprises try to save dollars in an economic downturn.


While Microsoft cited strong overall year-over-year growth in revenue in its fiscal Q1 2009 earnings report last week, the numbers for Windows Vista fell short of expectations, with year-over-year growth of just 2 percent.


The software giant expressed disappointment in Vista's lethargic sales growth, pointing to growth in inexpensive netbooks that use Windows XP or Linux and flat PC sales in developing countries as the two main culprits.


Vista sales moved only slightly in a quarter that saw 10 to 12 percent growth in overall PC shipments. Meanwhile, Microsoft has been hyping Windows 7 while barely mentioning Vista at its annual PDC (Professional Developers Conference) in Los Angeles. Is Vista being unduly neglected, and is it too late for a significant turnaround?


The answer is probably yes to both questions, says Roger Kay, president of research and consulting firm Endpoint Technologies.


Kay says that although Vista is "getting a bad rap," it never generated enough momentum when it launched. "It didn't make the big splash it should have in early 2007. And that window has pretty much closed," he says.


Microsoft's claims that skyrocketing sales of cheap netbooks and flat foreign sales are Vista killers left one Computerworld reader unconvinced.


"Microsoft blames the low uptake of Vista on 'netbooks and foreign sales of less expensive versions of Vista'....That is pure 100 percent unadulterated BS .... There are IT enterprise customers who wouldn't deploy Vista if their life depended on it. The real reason that MS is sucking air on Vista sales is that the consumer market is drying up due to the economy and the enterprise customers (even those with upgrade rights) are staying at Windows XP."


Kay asserts that Microsoft is not hiding anything and that netbooks and flat foreign sales are absolutely hurting Vista. But, he adds, "Vista adoption has been at historically low rates. The learning curve is too great and the transition costs are too high for the perceived benefits."


"Meanwhile, Microsoft is greatly affected by the PC lifecycle. If customers delay purchases for financial reasons, as they're doing now, then fewer units go out, affecting license shipments. Add to that a higher proportion of discounted developing world licenses and XP Home and embedded licenses and you have a recipe for slowing revenue growth."


Microsoft, for its part, remains confident that Vista is building momentum, and it stresses that those who have deployed Vista are satisfied.


"We are pleased with Windows Client momentum," says Ben Rudolph of Microsoft's Windows Business Group. "But success isn't measured just by sales-it's also important that our customers are deploying and using the product, and that they're satisfied with the experience. Based on our latest internal research, nearly 90 percent of Vista users are very satisfied or satisfied with the product. Across all of these metrics, we're pleased with the progress we're seeing."


Marginal 2 percent growth notwithstanding, it hasn't been all bad news for Vista over the past six months. Kay of Endpoint Technologies emphasized that "Vista has made improvements, notably with security and interface" and is "an overall good experience these days."


An IDC report from March 2008 predicts a "much stronger adoption curve for Vista on the business side now that Windows Server 2008 has launched." Similarly, an April 2008 report from Forrester Research entitled "Building the Business Case for Windows Vista" lays out the potential "harsh realities" for businesses that skip Vista.


A common Microsoft defense about Vista is that it has been unfairly battered by the press and that Vista adoption among businesses is actually outpacing what Windows XP did in the same timeframe.


Indeed, numbers from an April 2008 Gartner research report show that Windows XP and Windows Vista started from the same installed base percentage (4.7 percent) in 2002 and 2007, respectively. Windows XP's business installed base stood at 16.9 percent in 2003, compared with a projected Windows Vista business installed base of 21.3 percent in 2008.


However, these Vista growth measurements were held up to scrutiny, and Microsoft was accused of spin, because Windows XP had to compete with Windows 2000 Professional upgrades during its early years while Vista faced no such obstacle. In fact, six years had passed between XP's release and Vista's, giving Vista the benefit of pent-up upgrade demand that XP never had.


So as we head into the holiday season in an uncertain economy, Vista is getting bounced around among Microsoft's proud claims of widespread customer satisfaction, conflicting research reports of success and failure, and screams from industry watchers that Vista is dead.


Kay says that regardless of improvements in Vista and reports of continued growth, Microsoft's mind's eye is set squarely on Windows 7.


Adds Kay, "Microsoft has a history of being more excited by future products than current ones, and that has taken some of the wind out of Vista's sails."




Reference : http://www.pcworld.com/article/153089/vista.html?tk=rss_news

Apple ads blunt Microsoft’s ‘I’m a PC’ campaign

Apple’s anti-Vista response last week to its rival’s “I’m a PC” marketing campaign blunted the impact of Microsoft’s efforts, an Internet video metrics firm said Wednesday.

Although the trio of television advertisements that Apple used to bash Microsoft’s $300 million Windows marketing program were viewed fewer times in their first week than the “I’m a PC” ads were in theirs, Apple’s ads inspired twice as many video placements on the Web, said Matt Cutler, vice president of marketing and analytics at Visible Measures.

Cutler’s company tracks some 160 video sharing sites, scanning each one daily to spot new videos and tally views for those posted earlier. One of its primary jobs for customers is to monitor the “viral” spread of advertising.

“It’s not just about the brands today,” Cutler said. “Fans copy ads, they might mash them up, they might do a spoof. We throw a lasso around all these videos to determine the whole reach of a campaign.”

Last week, Apple hit back at Microsoft’s new “I’m a PC” campaign—which was a follow-up to controversial spots featuring comedian Jerry Seinfeld and former Microsoft CEO Bill Gates—with three spots that poked fun at its rival’s attempt to spruce up Windows’ image. Although ads in Apple’s long-running “Get a Mac” campaign typically mock Vista for its perceived problems, the newest ads took aim at the large amount of money Microsoft has devoted to revamping Windows’ reputation.

In the third Apple ad, dubbed “Bake Sale,” the “PC” character holds a bake sale to raise money as he bemoans the funds given to advertising. “Since my problems don’t seem to be a priority for them, I’m taking matters into my own hands ... a bake sale,“ says humorist John Hodgman, who plays the PC part.

According to Cutler, Apple’s ads garnered only 70 percent of the views tallied by Microsoft’s campaign. Apple’s ads were viewed approximately 1.2 million times in the first week after they were posted to the Internet; Microsoft’s “I’m a PC” ads, meanwhile, were viewed about 1.7 million times.

“There was lots of anticipation and discussion about Microsoft’s ads,” Cutler noted, especially after the unusual spots that featured Seinfeld and Gates. “It was new and different.”

But Cutler considered Apple’s ad views, even at just 70 percent of Microsoft’s, a win for the Cupertino, Calif.-based computer maker. “They were part of an ongoing series, so in that context 70 percent was a pretty darn good number,” he said.

Even more impressive was the broader viral spread of Apple's ads: They generated twice as many “placements”—distinct videos with their own URL—on the Internet as did Microsoft’s campaign. “From our perspective, they seem to be creating more buzz than the average Apple ad,” Cutler said. “If you look at the comments [on the Web], feelings were very mixed about the Apple ads, with people wondering if they were negative attack ads or had gone too far.”

That kind of discussion, or the sheer potential for argument, is crucial if ads are to spread virally, Cutler added. “There’s no guarantee of viral activity, but when an audience gets involved it can significantly increase the reach of a campaign,” Cutler said. “From an ROI perspective, this is very attractive.”

In fact, Microsoft’s Seinfeld-Gates ads were viral monsters after they debuted. “They crushed the ‘I’m a PC’ campaign numbers,” said Cutler, outperforming the follow-ups by “several hundred percent.”

No surprise, really. “Those ads had really challenged people’s perceptions,” Cutler said, talking about the discussions that raged about the ads' effectiveness or whether they even had a point. “They were successful because they didn’t answer very many questions. Virally, that's a good thing.”

Although Visible Measures doesn’t rate ad creative content, Cutler couldn’t resist stepping away from the strict metrics for a moment. “We would have loved to see more things in that [Seinfeld-Gates] series,” he said. “It would have been a much more difficult campaign for Apple to counter.

“If Microsoft had stuck with that, Apple might have responded with an ad that teamed Steve Jobs with Newman,” he said, describing a hypothetical campaign built around banter between Apple’s CEO and the Seinfeld television character played by Wayne Knight.



Reference : http://www.macworld.com/article/136454/2008/10/imamacads.html?lsrc=rss_main

Windows 7: The 'Dog Food' Tastes Bad

Not wanting to rag on something publicly that I hadn't experienced intimately myself, I decided to take the plunge (called "eating your own dog food" in developer parlance) and see if I could move over full-time to the new Windows 7 M3 pre-beta. After all, with an essentially unmodified kernel and no major changes to the security model, how bad could it be?

Apparently, a lot worse than I thought. After successfully backing up my notebook (Complete PC Backup in Vista is perhaps its single best feature), I fired up the Windows 7 DVD and had the new OS installed in about 25 minutes. Then the fun began.

My first compatibility roadblock involved Daemon Tools. One of the most widely used ISO-mounting utilities, Daemon Tools is a core part of my day-to-day compute stack. It's how I install software into any new system (physical CDs and DVDs are so yesterday), and as such, one of the first things I add to a new installation.

And it broke. Not in any minor, cosmetic way, either. It broke big time. The core "SPTD" driver -- the kernel-mode component used to simulate a physical CD/DVD drive -- refused to install. This came after I had forced the installer to continue by enabling the "Windows Vista RTM" option in the compatibility tab for its installer file (otherwise, Daemon would refuse to even attempt an install).

In the end, I was able to work around this by installing a competing utility -- Virtual CloneDrive. However, that sour taste of a failed transition was already building in my mouth. So when Skype 3.8 started freaking out (randomly crashing a few minutes after initial program load), I knew I was in trouble. The solution here was to download the Skype 4.0 beta. But since, like most people, I hate what Skype has done with version 4.0 (we're all praying it changes course away from that awful new UI), this was far from an ideal solution. Still, not a show-stopper.

What was a show-stopper was VMware Workstation 6.5. As a software developer and reviewer, I live and die by my ability to create and deploy virtual machines. It's how I review most software packages and also how I test my own code before moving it to a physical machine. So when VMware got all flaky under Windows 7 M3, I started reaching for my portable USB drive -- the one with the Windows Image Backup folder on it.

You see, VMware didn't just get quirky under Windows 7. It became unusable. First, it wouldn't start any of my existing VMs, ostensibly because its privilege requirements or security model was incompatible with the new "neutered" UAC (fewer prompts but more confusion as to what's actually going on behind the scenes). But what really got me steamed was the inability to use the bridged networking option. Though the bridging protocol was present and installed on the desired target adapter, VMware Workstation couldn't see the adapter -- there was no entry for it in the Virtual Network Editor screen, leaving me with NAT and Host Only as my only network options.

In the end, this was the straw that broke the camel's back. I can tolerate a lot of things, but breaking VMware isn't one of them. And since each new version of Windows seems to do exactly that -- break VMware's networking stack -- I'm starting to wonder if there isn't something malicious going on here, sort of like how Microsoft would deliberately break QEMM with each new version of DOS-based Windows.

Regardless, my real takeaway from all of this is that, despite leaving the core Vista kernel and driver model intact, Microsoft is still finding ways to break applications. So much for the whole "seamless transition" promise to Vista users. I can only hope that things get better before RTM or even the official beta launch. But, frankly, even at an M3 revision level, this sort of incompatibility nuttiness simply shouldn't exist -- not for an OS that is just a lipstick tube away from its piggish predecessor.

Reference : http://www.pcworld.com/article/153023/windows_7_dog_food.html?tk=rss_news

Why Windows 7 Will Be Better Than Vista

On the elevator to my hotel room, a 20-something man, too tanned and relaxed to be in the tech industry, spied the massive logo on the shopping bag-like tote that Microsoft doled out at its Professional Developers Conference. "Windows 7, huh?"


"There's always another one," I said.


Without missing a beat, he replied dryly, "They need another one." This gentleman is not a registered PC. Ironically, my tote's straps were tied around the handle of a less ostentatious rolling bag that cradles a new unibody MacBook Pro. Some things aren't worth getting into in an elevator.


It's really difficult for a savvy user not to bring a cynical, or at least skeptical, viewpoint to Windows 7 after the foot-shooting that was pre-SP1 Vista. What many end-users will see in Windows 7 is an effort to Mac-ify Windows, right down to enabling multitouch gestures on Tablet PCs, and copying Apple is instant, certain buzzkill.



Apple claims that Microsoft is suffering a drought of original ideas. Reading between the lines, Microsoft counters that Vista, before Service Pack 1 (it's proud of SP1 and later), was a mess for many reasons, but in part because every yahoo on the Internet was invited to transmit his gripes and fantasies directly to Microsoft product managers, who were then duty-bound to take them seriously.


Look, Don't Listen

That probably works when you're architecting the next rev of SQL Server, because the only people who supply feedback are those with specific expertise. Poll the public at large about what it wants from a client operating system, and you end up with a lot of data that can't be parsed definitively. I've put that too kindly. It's like trying to diagnose a hypochondriac's true ills based on self-reporting of his perceived ailments. It just wastes the time of people who have better uses for it. It's better to tell the patient to shut up and get the story directly from his body and blood.


For Windows 7, Redmond is telling the rabble to put a sock in it and is instead wiring the OS for more of what it calls "telemetry." Telemetry is the data that unmanned sensors like satellites, planetary rovers, and weather instruments send back to a place where it is stored and analyzed. When you click the check box in Media Player, Office, or Windows that invites you to help Microsoft make better products, your PC becomes a highly detailed recorder and reporter of your system's state and your usage patterns -- data that gets shipped to Microsoft.
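
A bare-bones sketch of the opt-in idea, with invented event names and report format rather than anything Microsoft actually ships, might look like this:
```python
# Hypothetical sketch of opt-in usage telemetry: record nothing unless the user opted
# in, keep only aggregate counts, and package them for upload. Not Microsoft's code.
import json
import time
from collections import Counter

class UsageTelemetry:
    def __init__(self, opted_in):
        self.opted_in = opted_in
        self.events = Counter()

    def record(self, event):
        if not self.opted_in:      # respect the check box: no opt-in, no recording
            return
        self.events[event] += 1

    def build_report(self):
        # Aggregated counts only; no personal data in this sketch.
        return json.dumps({"timestamp": time.time(), "counts": dict(self.events)})

telemetry = UsageTelemetry(opted_in=True)
telemetry.record("desktop_background_changed")
telemetry.record("desktop_background_changed")
print(telemetry.build_report())
```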


How detailed is this telemetry? Analyzing opt-in telemetry from millions of users, Microsoft determined that a substantial percentage of them change desktop backgrounds fairly frequently. Analysis revealed seasonal and periodic patterns in these changes, so one of Windows 7's features lets you schedule desktop background changes. It's a small thing and a poor example, but Microsoft's telemetry says that people will use it if it's exposed in a friendly way. Let users' usage patterns, not the users, tell the story. Apple never asks customers what they want; it watches what customers do. This delivers an interesting benefit: By analyzing what people do, you can identify workflow snags and spot common work-arounds. Fairly simple enhancements based on this data make it appear that the vendor has been reading your mind. How could they know exactly what I need?


They didn't waste their time, or yours, asking you what you need.


Search to Anywhere

A more significant example of a telemetry-inspired Windows 7 feature is search. Search has been integral to Windows Explorer since Clippy, and yet telemetry indicated that users still spend an inordinate amount of their workday looking for and organizing information. Windows users are just resigned to it. Windows 7 creates Libraries, somewhat akin to OS X's Smart Folders, in which data sharing a project, a type, or other criteria can be grouped.
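
Conceptually, a Library behaves something like the sketch below: a virtual collection that gathers files from several real folders sharing some criterion (here simply a file extension). The folder paths and criteria are examples only.
```python
# Rough sketch of a "library": one view over files scattered across several folders.
import os

def build_library(folders, extensions):
    """Return every file under the given folders whose extension matches."""
    matches = []
    for folder in folders:
        for root, _dirs, files in os.walk(folder):
            for name in files:
                if os.path.splitext(name)[1].lower() in extensions:
                    matches.append(os.path.join(root, name))
    return matches

# Example: a "Documents" library spanning two unrelated locations.
docs = build_library(
    [os.path.expanduser("~/reports"), os.path.expanduser("~/projects")],
    {".docx", ".pdf", ".txt"},
)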


Microsoft also observed that when users search, they want to search all of the sources that they can access regardless of their location. All of the search hits show up in one list, roughly Internet search engine style but better formatted (and without the ads), and the hyperlink in each search hit takes you to the data wherever it lives. Microsoft doesn't credit all of its innovations to telemetry, but one of the features of Federated Search that seems clearly based on behavioral data is that hyperlinks pointing to locations behind the company firewall automatically set up the equivalent of a task-specific VPN session without requiring user interaction.
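
The federated part amounts to running one query against several sources and merging the hits into a single list, each hit keeping a link back to wherever the data lives. The source functions and URLs in this sketch are stand-ins, not Microsoft's API.
```python
# Hypothetical federated-search sketch: query multiple sources, merge the hits.
def search_local(query):
    return [{"title": f"Local: {query} notes", "url": "file:///C:/docs/notes.txt"}]

def search_intranet(query):
    # In the real feature, a hit behind the firewall would transparently get a
    # task-specific VPN-like connection; here it is just a link.
    return [{"title": f"Intranet: {query} spec", "url": "https://intranet.example/spec"}]

def federated_search(query, sources):
    results = []
    for source in sources:
        results.extend(source(query))
    return results

for hit in federated_search("budget", [search_local, search_intranet]):
    print(hit["title"], "->", hit["url"])
```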

Microsoft isn't just watching end-users; it has analyzed the patterns of interaction between users and their help desks and found what we in IT already know: Support staff wastes an inordinate amount of time on problems that users can fix themselves, and waiting for a fix is unproductive time for users as well.



See Dick Crash

At its current pre-beta milestone, Windows 7 self-diagnoses and treats a variety of common ills, along the lines of the troubleshooting wizards scattered throughout Vista. What's different in Windows 7 is that the troubleshooters are scripted in PowerShell, making them modifiable and extensible by IT staff. When a user calls the help desk with a more complicated problem, rather than the staffer asking the user what they did in order to try to replicate it, a client-side activity recorder accumulates UI actions and screen shots at relevant intervals so they can be shown to an admin or a help desk staffer.
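
The real troubleshooters are PowerShell scripts, but the extensible check-and-report pattern they follow can be sketched in Python along these lines; the disk-space check is an invented example, not one of Microsoft's.
```python
# Sketch of an extensible troubleshooter registry: IT staff can drop in their own
# diagnostics next to the built-in ones. Illustrative only, not Windows code.
TROUBLESHOOTERS = []

def troubleshooter(func):
    """Register a diagnostic function so run_all() will execute it."""
    TROUBLESHOOTERS.append(func)
    return func

@troubleshooter
def check_free_disk_space(fix=False):
    """Invented example check: warn when free space on the system drive is low."""
    import shutil
    free_gb = shutil.disk_usage("/").free / 1e9
    if free_gb < 5:
        # A real troubleshooter might clear temp files here when fix=True.
        return True, f"Low disk space: {free_gb:.1f} GB free"
    return False, f"Disk space OK ({free_gb:.1f} GB free)"

def run_all(fix=False):
    for check in TROUBLESHOOTERS:
        found_problem, message = check(fix=fix)
        print(("PROBLEM: " if found_problem else "ok: ") + message)

run_all()
```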


When asked, a user can't retrospectively report the steps that led to a failure. So don't ask. Have them do it again, record it (not watch it via Remote Assistance), study the telemetry, and you'll know. When you find a solution, you can push fixes out in the form of registry patches, replacement code, or global policy changes. Microsoft somehow discovered that fixing what's broken once you figure out what it is happens to be an IT pain point.


Outrage over Microsoft's analysis of user behavior is senseless. Every move you make on the Web is tracked in excruciating detail. Commercial sites use analysis of statistics and usage patterns to change their design and to personalize presentation to target specific groups. There, telemetry is not opt-in and the data gathered is far from anonymous. It's the cost of opening your browser, and if that data weren't collected and analyzed, Amazon, Google, and InfoWorld.com couldn't change as your needs do.




Reference : http://www.pcworld.com/article/153019/windows_7_better_vista.html?tk=rss_news
