Monday, 28 July 2008

'World's Cheapest Laptop' Now Available

A company is now selling what it calls the "world's cheapest laptop," which, at US$130, is not a bad deal if you can bear some hardware limitations.
The Impulse NPX-9000 laptop has a 7-inch screen and comes with the Linux OS. It has a 400MHz processor, 128M bytes of RAM, 1G byte of flash storage and an optional wireless networking dongle. It includes office productivity software, a Web browser and multimedia software.
There's a caveat, though -- it has to be bought in bulk, in units of 100. The laptop is available through the online store of Taiwanese company Carapelli Ltd.
The cheapest laptop to date had been One Laptop Per Child's XO laptop, available at $188 for a limited time late last year. While a technological landmark, it had some hardware limitations, such as a slow processor and limited graphics capabilities.
The laptop hints at a trend of falling PC prices. Last week, a company called CherryPal introduced a $249 mini-desktop, also running a 400MHz processor, with 256M bytes of RAM and 4G bytes of internal flash storage.
In a recent interview with the IDG News Service, former OLPC CTO Mary Lou Jepsen said she would bring out a $75 laptop by 2010. Now running her own company, Pixel Qi, she cited the falling prices of RAM and components as a way to bring down laptop prices.
The low-cost laptop industry's poster child is Asustek Computer's Eee PC, which was introduced last year and sold 350,000 PCs in its first quarter. The cheapest Eee PC, for $300, has an 800MHz Intel processor, 512M bytes of RAM and 2G bytes of flash storage.

Vista: the 'New Coke' of tech

Fewer than one in eleven of the PCs being used in large or very large enterprises runs Windows Vista, according to survey results released last week by Forrester Research.
Of the 50,000 enterprise users surveyed by the Cambridge, Mass.-based analyst firm, 87.1% were still running Windows XP at the end of June, compared with 8.8% for Vista. According to report author Thomas Mendel, that implies that most of the PCs upgraded to Vista had been running older versions of Windows, such as Windows 2000 or 98.
"Vista is 'new Coke,'" Mendel wrote, comparing Microsoft's flagship OS to the ill-fated soft drink. Enterprises still on the fence about Vista would be wise, he said, to "consider following the lead of Microsoft's important partner Intel and re-evaluating the case of Vista."
Mendel's comments undercut the momentum for Vista claimed by Microsoft, which says it has sold 180 million licenses for its 18-month-old operating system to PC makers and end users.
Vista's share is still nearly double that of Macs among big businesses, however. Mac share grew to 4.5% in June from 3.7% in January 2008; 80% of those are Intel-based Macs.
Linux's share of desktops, meanwhile, fell significantly, according to Forrester, to 0.5% in June from 1.8% in January.
As a result, enterprise application developers only need to "develop exclusively for Windows XP and Vista. Forget about Macs unless you're aiming at a specific business vertical where Mac use is prevalent."
Forrester's study examined the Web browsers as well as the desktops of the 50,000 users, spread out among 2,300 companies. It found that 19.4% of enterprise users are using Firefox, up from 16.8% at the beginning of the year. Meanwhile, Microsoft Internet Explorer's (IE's) share slipped only slightly, from 79.1% in January to 77.6% at the end of June.
"At least make sure that applications work on Firefox as well as IE -- this is a must," Mendel wrote.
Apple Inc.'s Safari owns only a small slice of the market -- 2.4%, according to Forrester.
Both Flash and Java were nearly ubiquitous. Flash Player version 9 was on 97% of desktops, while Java was on 99.9% of them. But application developers shouldn't try too hard to jazz up their apps with Flash elements -- "business users don't want to hunt for navigation nor do they crave excitement," Mendel wrote.
Forrester also discovered that despite ever-increasing screen sizes and resolutions, the largest slice -- 34.1% -- of business users are using screens between 15 and 17 inches in size with resolutions of 1024 by 768 pixels; another 25.2% use screens between 17 and 19 inches in size with resolutions of 1280 by 1024 pixels.


Saturday, 26 July 2008

Google Counts More Than 1 Trillion Unique Web URLs

In a discovery that would probably send the Dr. Evil character of the "Austin Powers" movies into cardiac arrest, Google recently detected more than a trillion unique URLs on the Web.
This milestone awed Google search engineers, who are seeing the Web growing by several billion individual pages every day, company officials wrote in a blog post Friday.
In addition to announcing this finding, Google took the opportunity to promote the scope and magnitude of its index.
"We don't index every one of those trillion pages -- many of them are similar to each other, or represent auto-generated content ... that isn't very useful to searchers. But we're proud to have the most comprehensive index of any search engine, and our goal always has been to index all the world's data," wrote Jesse Alpert and Nissan Hajaj, software engineers in Google's Web Search Infrastructure Team.
It had been a while since Google had made public pronouncements about the size of its index, a topic that routinely generated controversy and counterclaims among the major search engine players years ago.
Those days of index-size envy ended when it became clear that most people rarely scan more than two pages of Web results. In other words, what matters is delivering 10 or 20 really relevant Web links, or, even better, a direct factual answer, because few people will wade through 5,000 results to find the desired information.
It will be interesting to see if this announcement from Google, posted on its main official blog, will trigger a round of reactions from rivals like Yahoo and Microsoft.
In the meantime, Google also disclosed interesting information about how and with what frequency it analyzes these links.
"Today, Google downloads the web continuously, collecting updated page information and re-processing the entire web-link graph several times per day. This graph of one trillion URLs is similar to a map made up of one trillion intersections. So multiple times every day, we do the computational equivalent of fully exploring every intersection of every road in the United States. Except it'd be a map about 50,000 times as big as the U.S., with 50,000 times as many roads and intersections," the officials wrote.

List the 64-bit GUI Programs on Your Mac

The evolution in computing horsepower is more of everything--first it was more gigahertz, then more cores, and on top of that, more bits for the integer registers inside the CPU. Whereas 32-bit registers have been the historical norm, 64-bit processors are taking over. A CPU with 64-bit registers can address much more RAM than a 32-bit CPU can, which is a big advantage for programs that require large amounts of memory. Today's Intel Core 2 Duo chips are 64-bit, to name but one example.
OS X, however, is not yet a fully 64-bit operating system. With the introduction of OS X 10.4, you could run some programs in Terminal in 64-bit mode. Things changed with 10.5, when support was added to allow some 64-bit applications in the GUI, assuming the program has been coded to work in 64-bit mode. (Things will change even more in OS X 10.6 when it comes out in a year or so, as it will be a fully 64-bit operating system.) But enough with the history of the CPU. If you're curious as to which of your programs are already 64-bit, you can find out with a Terminal command.
If you're on an Intel-powered Mac, this command will show you all the programs on your machine that can run in 64-bit mode:
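The command itself did not survive in this copy of the post; based on the description in the paragraph below (locate feeding matches to xargs and file, then grep for x86_64), a reconstruction might look like this -- the exact locate pattern is an assumption:

```shell
# list application binaries that contain 64-bit Intel (x86_64) code
# NOTE: reconstructed from the article's description; the locate pattern is a guess
locate -0 "*/Applications/*/Contents/MacOS/*" | xargs -0 file | grep x86_64
```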
If you're on a PowerPC-powered Mac, use this version instead:
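Again, the original command is missing here; per the description below, the PowerPC variant would differ only in the grep target (ppc64 instead of x86_64) -- the locate pattern is an assumption:

```shell
# list application binaries that contain 64-bit PowerPC (ppc64) code
# NOTE: reconstructed from the article's description; the locate pattern is a guess
locate -0 "*/Applications/*/Contents/MacOS/*" | xargs -0 file | grep ppc64
```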
That's actually three separate commands, of course. locate lists all files matching a given condition (in this case, a string that contains indications that something is in Applications, along with the path to the folder containing the actual executable file), and xargs runs a command against the set of matches passed to it (one match at a time) from locate. In this case, that command is file, which returns information about files. Both commands use the -0 option, which replaces the standard line break separator with a NUL character. Finally, a grep command is run to search for a string (either x86_64 or ppc64) within the output of the file command.
When run on my MacBook Pro, the output isn't the easiest to read, but it's still relatively straightforward--just look for the ".app" portion of each line. In my case, the 64-bit programs include Chess, Java Preferences and Java Web Start, Araxis Merge, etc.
If this command doesn't work for you, it's possible your locate database hasn't been created (OS X's maintenance routines should create it if they get the chance to run). To create the database manually, use this Terminal command:
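The command is missing from this copy of the post; on OS X of that era the locate database was rebuilt with the bundled updatedb script (the path below is assumed):

```shell
# rebuild the locate database manually (prompts for an admin password)
sudo /usr/libexec/locate.updatedb
```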
That command will take a while to run, depending on the size of your hard drive. Once it's done, though, you can use the above commands to check and see which of your programs are ready for the coming 64-bit world.

Microsoft Bolsters Ruby Efforts

Microsoft on Thursday plans to delve deeper into Ruby programming, shipping Ruby libraries and participating in a testing project for the language.
The libraries are akin to any other software library, helping developers build software.
The company at the O'Reilly Open Source Convention (OSCON) also will announce intentions to participate in the RubySpec project, which features a standard test suite used to define a compliant Ruby implementation.
In a prepared statement, Microsoft's John Lam, program manager for the company's Dynamic Language Runtime team, stressed the company's Ruby backing.
"All of these [OSCON] announcements underscore our commitment to listening to customer feedback and ensuring that we are true to Ruby as a language while still bringing the full benefits of .Net programming to the Ruby user base," Lam said.
IronRuby, a version of Ruby for Microsoft's .Net platform, is in development at the company, which as of Wednesday morning had not yet announced a release date for the 1.0 version.
Also at OSCON, Microsoft will unveil IronRuby-Contrib, a Microsoft Public License (Ms-PL) open source project for collaborative development of code supporting IronRuby or the underlying platform, but not part of the IronRuby distribution. An example of such a project would be the Ruby on Rails plug-in built to make it easier for Rails developers to add Microsoft's Silverlight rich Internet application technology to their applications, a Microsoft representative said.
Under Ms-PL, licensees can change source code and share it with others. They also can charge a licensing fee for modified work. Microsoft uses this license most commonly for developer tools, applications, and components.
While often criticized by open-source advocates, Microsoft nonetheless has established a presence at OSCON this week, with its sponsorship of the Participate08 session at OSCON, which was focused on boosting dialogue about open source and other collaborative communities.
On Friday at OSCON, Sam Ramji, Microsoft director of platform strategy, is scheduled to present on "Open Source Heroes." His brief talk will cover Microsoft community participation and ways in which Microsoft plans to contribute during the next 10 years of open-source development, according to the conference program.
Story copyright © 2007 InfoWorld Media Group. All rights reserved.

Three Keys to Getting Projects Under Control

As organizations struggle to extract the expected value from projects they fund, managers are charmed by a wide array of fads, techniques, marketing hype and buzzwords. Every proposed solution trumpets its track record in some particular situation. But, once the "latest and greatest, new and improved" tool is actually deployed, more often than not, reality seems to walk productivity off the cliff. And the sponsor is left with late, deficient and over-budget deliverables.
There are projects that deliver on time and within budget, but they are the exception rather than the rule. And frequently, these "successes" would not glow very brightly if management peeled back a reporting layer or two or rigorously compared what was delivered with what was originally planned. Simply put -- and the available statistics back up this statement -- most projects are out of control.
According to the Standish Group, 70 percent of projects are over budget or behind schedule. The published research indicates that 52 percent of all projects finish at 189 percent of their initial budget. Parenthetically, there is no indication anywhere that 48 percent of all projects finish at 11 percent of their initial budget. If there were, the good would balance out the bad. But unfortunately, that's not the case. Most projects are out of control.
In the first article in this series, we explored the challenges created by segmenting projects into discrete chunks, one of the latest solutions sweeping the industry. First of all, the time required to maintain that approach frequently results in a significant loss of productivity; i.e., we don't get done any faster, but we know a lot more about where we are at any given point in time. And secondly, as organizational seasons change, the shifting winds and currents dilute the overall effort, frequently resulting in an overarching project being dropped midcourse. So, the legacy of the small and quick approach remains consistent with history.
Two decades of successful project management in IT, capital construction, engineering and aerospace have revealed three keys to getting projects under control: plug leaks, have an idea and go granular.
In the first article we explored the first key to getting projects under control: "Plug Leaks." You plug leaks by clearly defining and enforcing the acceptable range of diversion. In this article, we will explore the second key to getting your projects under control: "Have an Idea."
Have an Idea
There is one simple way to clarify the murk of multiple projects. Instead of vague targets like "improve" productivity or throughput or client experience, think in very concrete terms: "Have an idea." In other words, know exactly what you are trying to accomplish. You know you "have an idea" when you can answer these questions. Where are you going? How are you going to get there? What will it cost? What is the payoff?
Management sponsors projects to solve problems, but all too often, it is unclear what the relationship is between the cost of the problem, the ultimate savings associated with solving it and the investment required to solve it. And, every once in a while, managers need to climb on top of their desks to see the "big picture" and then communicate what they see. Lower-level people can't take the big picture into account if no one tells them what it is. At every decision point, every stakeholder needs to know more than just "turn left" or "turn right." They need to know whether the destination is Boston or L.A.
A major challenge to answering "How are you going to get there?" and "What will it cost?" arises when projects are planned by small groups that are insulated from the realities of what it takes to actually complete the assignment. Besides the burden of isolation, they often suffer even more because they are under the gun to complete a proposal within rigid time constraints.
For example, a team includes an arbitrary one-month lead time for securing a subcontractor, when in reality it takes six to eight weeks just to complete the RFP cycle, let alone the time required for negotiations and contract sign-off. Or someone reviewing a proposal changes the scope or terms without adequate feedback because "that's what the client wants; we'll have to make it happen."
The last question may be the most important one: What is the payoff of completing the project? The core of "Have an idea" is identifying the quantifiable benefit, and then highlighting it regularly at all levels.
For example, what would it take to specify a new dam and electric power plant? You would have to begin by clearly defining the payoff. You can have a need for more electricity, a suitable site, and the resources to build and maintain such a plant, but it is not feasible if it will not generate enough revenue. That is the payoff, the quantified benefit.
This will not only deepen the team's commitment to the result but also reduce the impact of environmental shifts. Because the team understands where it is going, how it is going to get there, how much it will cost and the payoff, when there is a shift in management or some new buzzword sweeps the industry, the changes it brings will fold into the project instead of derailing it.
So, to reiterate, you "have an Idea" when management and team members can specifically answer the questions: Where are you going? How are you going to get there? What will it cost? What is the payoff?
Having now covered the second key to getting projects under control, "Have an Idea," along with the first key, "Plug Leaks," the next, and last, article in this series will explore the final key: "Go Granular."
John Troyer has more than 20 years of successful experience leading teams as a project, program, implementation, deployment and department manager in a wide variety of disciplines and environments including DoD, aerospace engineering, IT, capital construction, finance, procurement and cost reduction.

Smart Robots Will Explore Universe by 2020

Before the year 2020, scientists are expected to launch intelligent space robots that will venture out to explore the universe for us.
Researchers are working on creating autonomous spacecraft that will be able to analyze data about points of interest as they pass and then make quick decisions about what needs to be investigated, according to Wolfgang Fink, a physicist and senior researcher at the California Institute of Technology.
"Robotic exploration probably will always be the trail blazer for human exploration of far space," said Fink. "We haven't yet landed a human being on Mars but we have a robot there now. In that sense, it's much easier to send a robotic explorer. When you can take the human out of the loop, that is becoming very exciting."
NASA and the Jet Propulsion Laboratory are using a robotic arm onboard the Mars Lander, which has been working at the Martian north pole for more than a month and a half now. Programmers send up daily instruction code to the arm, telling it to dig trenches in the soil or scoop up ice scrapings and deposit them in one of the analysis tools onboard.
Fink said that's a great start, but he's looking forward to the day when the robots can make at least some of the decisions for themselves.
"The arms are the tools, but it's about the intent to move the arms," he added. "That's what we're after. To [have the robot] know that something there is interesting and that's where it needs to go and then to go get a sample from it. That's what we're after. You want to get rid of the joystick, in other words. You want the system to take control of itself and then basically use its own tools to explore."
The physicist said he envisions a time when humans send out intelligent probes to explore the far reaches of the universe and send information back to Earth -- without having to send people on excruciatingly long and dangerous space missions.
"In the old Star Wars movies, especially in the Empire Strikes Back, the empire was sending out probes or floating robots," said Fink. "Those were ideal robotic explorers because they floated over planets and had sensors and communication capabilities. Once you venture out to other planets, you need something that can operate on its own. You can't monitor and supervise every single step. You want to deploy something that, on its own, can start a reconnaissance of the area and report back."
What will make the spacecraft or space robot intelligent is its ability to recognize something of interest -- say, a crater on a planet or an asteroid -- and then decide to go investigate. And giving a machine that complex ability will be no easy task, but CalTech scientists already have begun working on it.
According to Fink, CalTech is working with scientists at the University of Arizona to develop software packages that use camera images to enable machines to distinguish colors, shapes, textures and obstacles. With the ability to pick out these features, the software can begin to calculate what is anomalous -- much like the children's game of 'which one of these things does not belong?'
Researchers have hooked the software up to a rover and soon will be linking it to the rover's navigation functions.
The researchers also are working on a wish list of sorts for the spacecraft. The list would include things that NASA and university scientists would like the robot to investigate. "It's very difficult to teach a spacecraft," said Fink. "When a geologist goes into the field, they can tell you if they see something that sparks their interest. Based on that interest, it triggers more refined research. But the problem is if you encounter something that scientists had not foreseen, then you run the risk of not detecting it. We'll equip it with a database and a wish list, along with the ability to flag an anomaly."
Fink said NASA has shown some interest in their work. And that makes sense since NASA is planning an unmanned mission to Titan, Saturn's largest moon, around 2017. The CalTech physicist explained that an orbiter would most likely release a balloon-type vehicle that would float above the surface of the moon and send its findings back to Earth.
"It takes more than an hour to send communications back and forth to a space probe at Saturn or Titan," said Fink. "It is not a problem so much if you are dealing with a Lander, which is immobile, or when you're dealing with a rover which is not moving too fast. It becomes a significant problem if you deploy a balloon or air ship on Titan, let's say. They are floating so you need a much quicker reaction time. If there's a mountain or hill coming up, you need to make a decision right there and then to avoid it."

14 Common Project Management Mistakes

It's no wonder only 29 percent of IT projects are completed successfully, according to The Standish Group. Project management consultants and software providers say they see IT departments making the same project management mistakes over and over: IT groups don't follow standard project management processes. They don't have the right staff working on projects. They don't assess the risks that could imperil their projects or determine ways to mitigate those risks. The list of mistakes unrolls like a ball of yarn.
Most of the project management mistakes IT departments make boil down to either a lack of adequate planning or breakdowns in communication (either among the project team or between the project team and the project sponsors). These mistakes can be fatal. They can also be avoided. And who better to point out the most common project management mistakes than project management vendors and consultants? (They also suggest ways to avoid them.)
The following list of the 14 most common project management mistakes ought to help you pinpoint where your projects are going wrong and what measures you can take to improve them. The upside of avoiding these common pitfalls is tremendous. Not only will your project success rate increase, you'll also improve satisfaction among internal customers, IT's stock inside the organization will rise in value, and the business will benefit from systems, delivered on time and on budget, that make it more competitive.
Staffing Mistakes

Mistake No. 1: Projects Lack the Right Resources with the Right Skills.
Impact: Proper project staffing is critical, yet improperly allocating resources tops the list of most common project management mistakes. Not having the right people on a project can kill it. "The key to getting a project successfully accomplished is getting the right people with the right skills," says Joel Koppelman, CEO of project management software vendor Primavera. "All the planning in the world won't overcome an insufficiency of talent."
Solution: IT and project managers need full visibility into the skills and workloads of all of their resources, including consultants, contractors and outsourcers, who often get left out of skills assessments even though they're doing a "huge" proportion of work, says Koppelman. Project management software can provide such visibility into everyone's skills and workloads.
Once IT and project managers know who's doing what, they have to figure out how to allocate resources across myriad projects and day-to-day work.
"There are all kinds of organizational models," says Richard Scannell, co-founder of IT infrastructure consultancy GlassHouse Technologies. "I've never seen anything that works well. There's no easy answer [to the resource allocation question]."
You just have to try synchronizing people and projects as best you can, says Koppelman, adding that one potential solution is to appoint a resource manager who's responsible for figuring out who will be assigned to each project and for ensuring there's a fair allocation of talent across projects.
Scannell suggests setting up "tiger teams" where people get taken out of their traditional job responsibility for a year or more to work on a specific project. Ken Cheney, director of HP Software's PPM Center, recommends assigning resources at a project level as opposed to a specific task level, which he says is much more arduous.
If you're still hard-pressed to adequately staff projects, you may be able to free up resources by cancelling a "discretionary" project (e.g. one that isn't tightly tied to the business strategy), says Cheney. He suggests looking at your entire portfolio of projects your IT staff is working on to identify ones that aren't mission-critical. "By stopping those projects and reallocating resources to projects that will have the biggest impact, the organization as a whole can be much more successful," he says.

Mistake No. 2: Projects Lack Experienced Project Managers.
Impact: Projects can quickly grow out of control without a savvy project manager at the helm.
Solution: Hire project managers with certifications and the finesse required to manage stakeholders. Matthew Strazza, vice president of services (North America) for CA, says good project managers have to have strong soft skills. They need to know how to facilitate meetings, manage risk and handle a variety of different stakeholders -- the business people who are looking for functionality, the IT people who care about security, and the financial people who are worried about the budget.
"If you're not addressing the financials, managing the budget on a week-to-week basis and notifying the client of any change, you can get into trouble pretty quickly," says Strazza.
Good project managers also need to possess technical expertise in whatever technology is being deployed, he adds.

Process Mistakes
Mistake No. 3: IT Doesn't Follow a Standard, Repeatable Project Management Process.
Impact: This is the second of the most common project management mistakes. Lack of methodology increases the risk that tasks related to the project will fall through the cracks, that projects will have to be re-worked, and ultimately that a project won't be completed on time or on budget.
Solution: A project management methodology helps you tackle projects efficiently and makes you aware of all the activities involved in the execution of a project.
"Having in place a baseline of standards and methodologies will remove a lot of the risk associated with IT projects," says HP's Cheney.
Douglas Clark, CEO of Métier, a provider of project portfolio management solutions, recommends establishing repeatable processes for scoping, scheduling, allocating resources and communicating with stakeholders. "Those are the things you want to get a handle on first because they would probably give you the biggest payoff," he says.

Mistake No. 4: IT Gets Hamstrung by Too Much Process.
Impact: Too much process makes the project team inflexible, and their inflexibility frustrates stakeholders.
Fumi Kondo, managing director of NYC-based consultancy Intellilink Solutions, once observed an exchange between a software developer and a project manager where the developer told the project manager that he could add extra features to an application with no additional effort. The project manager told the developer not to add the extra features because users hadn't asked for them. "My response would have been, 'Go to the users and see if those features are useful,'" says Kondo. "I see nothing wrong with over-delivering if it doesn't impact the budget or the schedule."
Solution: Be flexible and communicate with project sponsors and stakeholders.

Mistake No. 5: They Don't Track Changes to the Scope of the Project.
Impact: The budget for the project explodes. So does the timeline.
Solution: CA's Strazza recommends following a formal change request process: The individual requesting the change in scope (e.g. additional features or functionality) needs to explain the specific changes on a change-in-scope document, and the project manager needs to determine how that request will impact the budget and timeline. The project sponsor has to sign off on the change-in-scope request.

Mistake No. 6: They Lack Up-to-Date Data About the Status of Projects.
Impact: You can't manage what you can't measure, as Peter Drucker would say. Nor can you coordinate resources or react to changes in scope, says HP's Cheney.
Solution: Software.

Mistake No. 7: They Ignore Problems.
Impact: Problems don't solve themselves. They fester the longer you ignore them and ultimately compound the cost of the project.
Solution: "If you do something wrong, it's about how well you fix it," says GlassHouse Technologies' Scannell. "Most people batten down the hatches and look up in a month. Understanding when you're starting to fail and quickly being able to engage as many stakeholders as possible to fix it is critical."
Planning Mistakes

Mistake No. 8: They Don't Take the Time to Define the Scope of a Project.
Impact: If a project's scope isn't well-defined by the business and IT up front, the project can end up ballooning like Friends actor Matthew Perry in the sitcom's later seasons. What's more, IT lacks the clarity and direction it needs to complete the project on time and on budget and meet the business's expectations.
Solution: Ill-defined projects are best served by a business case and a scoping exercise, says Intellilink Solutions' Kondo.

Mistake No. 9: They Fail to See the Dependencies Between Projects.
Impact: Projects don't happen in isolation. They're often dependent on other projects going on at the same time. When project managers fail to see the dependencies between projects -- such as staff assigned to one project being needed on another -- projects get held up. Such slowdowns can have a ripple effect on all projects.
Solution: Take dependencies into account during project planning, says Métier's Clark. Talking with stakeholders and diagramming the project can help uncover dependencies.

Mistake No. 10: They Don't Consider Murphy's Law.
Impact: Stuff happens, and IT gets surprised by it. Consequently, the project goes off-track while IT tries to clean up a mess it didn't anticipate.
GlassHouse Technologies' Scannell recalls a company in the U.K., acquired by his firm, that was moving its mainframe to a new data center. The IT group devoted an entire Saturday to taking down the mainframe so that they could move it to the new data center the next day, he says. While the IT staff were en route to the new data center with the mainframe on Sunday, they ran into a gay pride parade, and they couldn't reach their destination due to roads blocked off for the parade. They had to drive back to the original data center and put Humpty Dumpty back together again. The lack of planning caused the IT staffers to do more work than was necessary.
Solution: Perform a risk assessment as part of the project planning. With your team, brainstorm what could happen to slow or derail the project, to make it go over budget, or to prevent you from delivering the expected requirements. Then figure out ways you can mitigate those risks, says Primavera CEO Koppelman. "If they sit down and think about those risks, they'll come up with a pretty good list," he says. "This exercise doesn't take a long time, and it's enormously helpful in understanding the soft spots in a project before it even gets underway."

Mistake No. 11: They Give Short Shrift to Change Management.
Impact: All the time, money and hard work that went into delivering a new IT-enabled capability can be for naught if users don't adopt the new technology.
Solution: Spend time up front during the project planning phase to consider where resistance to a project will manifest itself and ways to address it, says Métier's Clark. Identify the stakeholders whose jobs will be impacted by the new capability, adds Intellilink Solutions' Kondo, and plan how you'll communicate changes to their processes and workflows with them. Not all of the changes will be negative.

Mistake No. 12: Project Schedules Are Incomplete.
Impact: Project team members don't know what is due when, which makes completing the project on time a challenge.
Solution: Clark says a quick way to come up with a schedule for a project is to determine all the activities involved in getting the project done (e.g. scoping, getting requirements, testing and implementing) and then attaching due dates to those activities based on the deadline for the project. Project management software can also help create schedules.
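Clark's approach of attaching due dates to activities based on the final deadline amounts to walking backwards from that deadline. A minimal sketch, with activity names and durations assumed purely for illustration:

```python
from datetime import date, timedelta

# Hypothetical activity list with estimated durations in calendar days,
# in the order they must finish; the deadline anchors the whole schedule.
activities = [("scoping", 5), ("requirements", 10), ("build", 20),
              ("testing", 10), ("implementation", 5)]
deadline = date(2008, 12, 19)

# Walk backwards from the deadline, assigning each activity a due date
# (calendar days only here -- no weekend or holiday handling).
schedule = {}
due = deadline
for name, days in reversed(activities):
    schedule[name] = due
    due -= timedelta(days=days)

for name, _ in activities:
    print(f"{name}: due {schedule[name]}")
```

Project management software does the same arithmetic, plus resource leveling and working-day calendars, which is why the article recommends it for anything non-trivial.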

Communication Problems
Mistake No. 13: IT Doesn't Push Back on Unreasonable Deadlines.
Impact: IT sets itself up to fail and gets a reputation for not being able to deliver projects on time.
Clark says IT departments will scramble to accommodate project deadlines set by the CEO. But tampering with dependencies and with the plan only creates more problems that make delivering the project on time even more difficult, he says.
Solution: IT management has to explain to the CEO what it's going to take to meet that deadline in terms of cost and resources and has to get the CEO to choose between cost, scope and schedule, says Clark.

Mistake No. 14: They Don't Communicate Well with Project Sponsors and Stakeholders.
Impact: IT fails to deliver the expected requirements.
Solution: Project communications need to be catered to the audience, says Kondo. She sees misunderstandings about the scope of a project or a system's requirements arise when IT departments hand over a spreadsheet to the business with thousands of lines describing the system's functionality and specs. Because the business owners don't have time to look over such detailed technical documents, they ignore them.
"One side is communicating, but in a language the other side can't understand," says Kondo. "Then IT gets frustrated and they say,'We described this to them. How come this isn't what they want?'" ( Business analysts play a critical role as the liaisons between users and IT.)
Kondo recommends giving every stakeholder who will be impacted or involved in the project on the business side a high-level overview of the entire project, from design to rollout. The overview should highlight the activities that require interaction with the business and should explain why the business is needed, she says.
In general, IT needs to put more effort into educating the business about the steps involved in executing a project, says Kondo.
"If you have an open dialog about what's needed, what you're really delivering, and you have fluidity built into the process, the budget and scope becomes a dialog so if you go over budget, it's not necessarily a failure," she says.
Kondo's firm once worked with a client that was deploying a financial system and whose employees had never been involved in a large system implementation before. When design of the system was complete and Intellilink was beginning to plan for testing, Intellilink explained to the employees why testing was important.
"We told them about different kinds of testing and what they did and didn't need to be involved in। We talked about why we needed user input, what kind of input we'd need and how much time it required," says Kondo. "That gave people an idea of why it takes so long to test."


The Death of DRM

As Yahoo says it will shut down the servers that allow transferring its DRM music, Britain considers a move that would green light unfettered music downloads for a yearly tax. Yahoo rang the latest note in the seeming death knell for DRM-protected music as the company decided to "abandon customers who bought tracks from its music store encoded with DRM," according to the IDG News Service.
The company will shut down the Yahoo! Music Store, along with the servers that renew DRM licenses, on Sept. 30. After that, according to the notification e-mail from Yahoo, "you will not be able to transfer songs to unauthorized computers or re-license these songs after changing operating systems (full text available from the LA Times)." Songs currently playing on a PC should continue to do so until the OS changes, and "backing up your music to an audio CD will allow you to copy the music back to your computer again if the license keys for your original music files cannot be retrieved," per the e-mail.
The timing of the move is somewhat surprising given that Microsoft just tried to do the same, but last month reversed course because of consumer backlash and announced it would keep the license-renewal servers going until 2011. But it's not surprising given how DRM has been faring overall. It's dead, Jim.
With Apple leading the way, more and more companies are offering DRM-free music for sale. Musicians are going one step further - Techdirt heaps praise on Trent Reznor for intelligently offering a limited edition CD for sale as he offers the tracks themselves as a free download.
And finally, on a much larger scale, BusinessWeek reports that Britain is mulling a plan to levy a yearly tax on broadband users. In exchange, people could legally download as much music as they wanted, which makes it sound like a government-run national subscription service. The tax proceeds would be distributed among the copyright holders.
An interesting idea, to say the least. I don't know if it would work, but it sure couldn't fail as badly as DRM.

Augmented Reality Coming to Your Desktop Soon

A Tokyo-based start-up has taken the wraps off new "augmented reality" software that allows the real world and computer imagery to meet on the desktop.
That's your actual desktop, not the Windows one. And it won't be a surprise to anyone who's spent time with computer graphics in Japan that this new world is inhabited by a cute, computer-generated girl.
Point your webcam at your desk and the software will display the image on your computer screen. Place a special cube with a 2D bar code on the desk in the center of the webcam's field of view, and the software will overlay the CG character, called "Aris" (say it like Alice), on the video image at the position of the cube.
Leave her alone and she'll tug at her blue and white maid's outfit, sit around and even appear to clean your desk if you're lucky.
But bring another coded cube -- mounted on a short plastic stick for easy use -- close by and you can interact with the character.
You can poke her and annoy her in various ways and even strip her down to a skimpy bikini. She'll complain at this abuse but still comply, so if you're feeling guilty you can give her a present represented by another coded cube. She gets happier when she sees the present and positively adores you when she finds out it's a teddy bear.
The software transforms this real world of a few coded cubes sitting on your desk into the augmented reality world, visible through your computer monitor, where the computer-generated Alice is dancing around on your desk and interacting with you.
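For readers curious about the mechanics, the core placement step -- anchoring and scaling the overlay from the detected marker -- reduces to simple geometry. The sketch below is a toy under invented assumptions (the corner coordinates are made up, and a real system detects the marker with a computer-vision library; this is not Geisha Tokyo's code):

```python
# Given the four corner points of a detected marker in the webcam image,
# anchor the character sprite at the marker's centre and scale it with
# the marker's apparent size, so a closer marker yields a bigger Alice.

def overlay_placement(corners, sprite_height_per_marker=3.0):
    """corners: list of four (x, y) image coordinates of the marker."""
    xs = [x for x, _ in corners]
    ys = [y for _, y in corners]
    cx, cy = sum(xs) / 4, sum(ys) / 4                  # marker centre
    size = max(max(xs) - min(xs), max(ys) - min(ys))   # apparent size in pixels
    return {
        "anchor": (cx, cy),                            # where to draw the sprite
        "height": size * sprite_height_per_marker,     # scale with distance
    }

# A marker seen as a 100-pixel square centred at (320, 240):
placement = overlay_placement([(270, 190), (370, 190), (370, 290), (270, 290)])
print(placement)
```

Real augmented-reality toolkits go further and recover the full 3D pose of the marker from those corners, which is what lets the character appear to stand on the desk rather than float on the image.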
If you think stripping down CG characters to their swimsuits is a little low-brow for such technology, consider this: the company showing the system, Geisha Tokyo Entertainment, was formed and is largely staffed by graduates of the University of Tokyo, Japan's top university, and this first software is more about exploring the market and perfecting the technology before they go on to tackle bigger entertainment projects using augmented reality.
"We don't expect to sell many of this but we hope it provides the base to go on to bigger and more famous characters," said Taisei Tanaka, CEO of the company during an interview at this week's Wireless Japan show in Tokyo.
Alice should be arriving in October in Japan and the company has plans to sell the software elsewhere, including the U.S. and China.

One of Gates' Favorite Themes Continues in His Absence

This year, the biggest crowd at lunch during Microsoft's analyst meeting was at CEO Steve Ballmer's table, with people standing two rows deep around the lucky few who actually scored a seat.
In past years the honor would have gone to Bill Gates, who stopped working full time at the company last month. But although Gates wasn't at the meeting in person, at least one project that has been near and dear to his heart still made it into the presentations.
Craig Mundie, Microsoft's chief research and strategy officer, showed off some futuristic ideas for a natural user interface -- one of Gates' favorite themes over the past few years.
Mundie demonstrated a future application for Surface, Microsoft's multitouch tabletop computer. In the scenario he described, the computer could appear in a hotel room, composed of both a tabletop and a large vertical screen on the wall.
Mundie placed his phone, which he'd just used to take a photo of a magazine cover, on the tabletop Surface. The computer automatically pulled the photo, which showed an art sculpture, onto the screen of Surface. When Mundie touched it and chose from a menu, the computer located the magazine's Web site and displayed the article about the piece of art.
Touching the name of the gallery where the sculpture is being shown opened a photograph on the vertical screen of the street outside of the shop. Mundie touched the front door of the shop on the screen and an interactive photograph of the interior popped up.
He could then virtually browse around the gallery, looking at each piece of work in three dimensions, turning them to view them from different angles. He could also open pages with additional information about the artist, including videos.
Separately, he gave a rough video demonstration of Microsoft employees interacting with a virtual assistant to schedule a shuttle bus on campus. The company has scores of shuttles that employees can order to take them from one building to the next on the massive campus.
In the video, the employees approach a computer that displays an image of a receptionist. She greets them and asks if they'd like to reserve a shuttle. The employees and receptionist speak naturally to set up the reservation.
Mundie didn't say when the technology he showed off might appear on the market. He said he wanted to demonstrate the applications that could emerge with more natural interfaces.
"This is what the natural user interface is really going to be about," Mundie said. "It's not just your receptionist. You should be able to interact with your computer in a much more natural way. This is just the tip of the iceberg."
While the concept of natural user interface began with Gates, it will be up to Mundie and Ray Ozzie, Microsoft's chief software architect, to continue supporting the idea. Mundie and Ozzie have split duties previously handled by Gates. Mundie will focus on external matters, including strategies in emerging economies, and think as far out as 20 years into the future. Ozzie will be internally focused on the company and think about the nearer-term time horizon of under five years.
While Gates isn't spending all his time at Microsoft anymore, he will serve as nonexecutive chairman and participate in select projects.

Microsoft Still Working on Online Branding Issues

Microsoft is working on a way to combine all of its online services under a single Web page, rather than continue on with its confusing online presence involving multiple brands and services reached through many different portals.
The change will happen along with a streamlining of the branding, said Microsoft CEO Steve Ballmer, in response to a question from an analyst during the company's annual analyst get-together on Thursday.
Microsoft has been criticized for the introduction of the Live brand, which didn't fully replace the MSN brand, resulting in a confusing online presence for the company.
"The real question isn't the brand question. The real question is, If you type 'www.,' what does that page look like?" Ballmer said. "I don't think it's going to be a blank page."
Currently, Microsoft's page and Google's site are essentially blank apart from the search bar. While the search bar will remain central, the new main site will also include other components. "Given the monetization model, it has to predominantly feature search. At the same time, it should have a range of content tailored and directed at you," Ballmer said.
The analyst who asked the question suggested that if Microsoft integrates its online branding and has a single showcase for all of its services, the company might be able to show Internet users that its services are worth using. Microsoft trails Google by a wide margin in search and has struggled to keep up with competitors in some other online services.
The branding issue and the restructuring of a main page were put on "short-term hold" during Microsoft's discussions with Yahoo, because if that deal had gone through, the problem would have been different, Ballmer said.
In early June, a Microsoft executive speaking at an advertising conference in Seattle first mentioned that the company was planning to "fix" its online branding problem.

Microsoft to Buy Data-warehouse Appliance Vendor

Microsoft continues its shopping spree to bolster its SQL Server database platform to make it more suitable for large-scale enterprise deployments. On Thursday the company said it plans to buy DATAllegro, a privately held maker of data-warehouse appliances.
The terms of the deal, which comes on the heels of one announced last week to purchase data-quality technology vendor Zoomix, were not disclosed. Microsoft will retain most of the 93 DATAllegro employees, who will continue to work out of their existing office in Aliso Viejo, California.
DATAllegro provides data-warehouse appliances, which combine data-storage functions with business-analytics software. According to the company, its appliances allow companies to rapidly query large volumes of data and have the flexibility and scalability enterprises need, but at a cost-effective price.
Microsoft plans to use DATAllegro's technology to extend the capabilities of SQL Server for enterprise customers, making it easier and more cost-effective for them to manage and mine data. The company is expected to reveal more details about what it plans to do with DATAllegro's technology in October at its Business Intelligence Conference, according to IDC analyst Dan Vesset.
Microsoft may run into some challenges when integrating DATAllegro's technology with SQL Server. One technical challenge will be to replace the open-source Ingres database that the acquired company's appliance is based on, wrote Forrester analyst James Kobielus in a research note released Thursday.
Another will be to convince customers to use SQL Server in favor of Ingres, he wrote. "Clearly, that migration to SQL Server may alienate a substantial portion of DATAllegro's existing customer base," Kobielus wrote, adding that it also will likely raise the price of Microsoft's version of DATAllegro's appliance.
However, on the plus side, Microsoft will provide what "DATAllegro has most critically lacked -- global sales, marketing and support -- in spades," he wrote.
Managing and getting relative business intelligence from data has always been a problem for business customers, particularly large enterprises, and customers long have used data warehouses to store and manage large quantities of data.
The data-warehouse appliance market, which combines storage and management with analytics, has been growing over the past several years because it provides an all-in-one package, Kobielus wrote.
"Over the past several years, the DW [data warehouse] appliance -- a preconfigured, pre-optimized bundle of hardware and software components -- has become the predominant go-to-market approach among both established and start-up DW solution providers," he wrote.
Microsoft's purchase of DATAllegro signals that there will be more consolidation in the data-warehouse space, with large enterprise data-warehouse vendors snapping up smaller, niche players, both Kobielus and IDC's Vesset said in separate research notes.
According to Kobielus, Forrester expects that incumbent enterprise data-warehouse vendors, such as Oracle, SAP and Hewlett-Packard, will follow Microsoft in the coming year to make strategic acquisitions in the market. Other pure-play companies still up for grabs in this space include Greenplum and Dataupia, he wrote.
Microsoft expects the deal to buy DATAllegro to close at the end of this month or the beginning of the next.

VMware to Offer Low-Footprint ESX Hypervisor Free

Responding to pressure from Microsoft's low-cost Hyper-V and from other companies threatening its lead in the virtualization market, VMware said Tuesday that it will offer the small-footprint version of its ESX virtualization software at no cost.
The next version of ESXi, which will come in about two weeks, will be available at no cost, said VMware CEO Paul Maritz on a conference call Tuesday to discuss the company's second-quarter earnings. ESXi is a basic hypervisor, which is technology that separates the OS from server hardware so multiple OSes can run virtually on one physical server.
Maritz said the move to make the already low-cost product free is part of VMware's plan to make its virtualization and network infrastructure products "as freely available to everyone in the industry" as possible as it diversifies its products beyond merely enabling virtualization. A former Microsoft executive, Maritz replaced VMware cofounder and former CEO Diane Greene, who was ousted in a sudden move two weeks ago.
VMware is facing some of its toughest competition yet as Microsoft and other companies seek to commoditize the core virtualization technology on which VMware's business was built by offering it as part of the OS.
Speaking about his "alma mater" Tuesday, Maritz called Microsoft a "formidable" competitor, but "not an invincible" one.
"I know that Microsoft can afford to play a long waiting game," he said. However, in markets where another company already has a sizable lead -- such as VMware does in virtualization -- it can be "really hard to catch [up] even for Microsoft," Maritz said.
VMware reported $456 million in revenue for its 2008 second quarter, which ended June 30. It was an increase of 54 percent from the same period last year. However, consensus estimates from Thomson Financial analysts expected the company to fare slightly better, predicting $458.6 million in revenue for the quarter.
Non-GAAP (generally accepted accounting principles) net income for the quarter was $92 million, or $0.23 per diluted share, which was in line with analyst estimates.
VMware fell more than 14 percent from a close of $37.97 to $32.50 in after-hours trading Tuesday.

Final Cut Server 1.1

Apple's long-awaited Final Cut Server 1.1 is a complex and powerful tool for Final Cut Pro users who have advanced requirements for organizing large amounts of footage, working with multiple editors and/or artists, or needing to automate certain production steps.
Final Cut Server is a client-server-based workflow tool that can potentially help you in three ways: in cataloging and searching assets, especially video-based assets; with version control through check-in and check-out capabilities and approval; and with automation capabilities to convert, copy, and execute scripts.
You can run the server side of Final Cut Server with a variety of hardware, from a high-end Xserve to a Mac Pro. From the main interface, you can see thumbnails, double-click to see small H.264 previews, see all the shot metadata, and drag a shot or group of shots directly into Final Cut Pro--very slick. The interface is intuitive and most features are pretty obvious and user-friendly. The software never crashed during my testing, and the documentation is up to Apple's usual high standards. I was not able to evaluate setup and installation because Apple provided a system that was preconfigured and had the software installed.
Catalog and search
Final Cut Server has powerful, flexible, and easy-to-use catalog and searching capabilities. This is useful when you have hundreds or thousands of clips for multiple projects that pull video from the same collection of footage, or one massive project for which you need better searching capabilities than those in Final Cut Pro.
The way Final Cut Server catalogs files, and the ease with which you can create detailed, customized, savable searches are major selling points. These will be the most readily usable features for most users.
Unlike other asset-management systems I've used that were aimed at Web or print production, Final Cut Server does not copy your footage or other assets and store them in a separate catalog file or database. Instead, it updates a catalog when new files are placed in folders it has been told to "watch" (more on that later). This allows you to organize your assets in exactly the way they are already working for you.
When assets are added to the catalog, either manually or via the folder-watching feature, the program will generate still and video thumbnails of the clips, and gather metadata (information about your assets) such as the shot name and any logging notes you made, all of which you can search later--even if the source files are in separate storage drives not connected to Final Cut Server. This is handy for shops that have lots of FireWire drives filled with footage.
Final Cut Server allows you to create and save custom searches similar to Finder smart searches. Here, I have saved a custom search to find all high-resolution footage shot on a Red One camera before June 25, 2008, that has the word "beach" as a keyword.
Even better, if you've ever used the smart search feature in the Finder, you'll feel right at home in Final Cut Server--you can build searches from multiple criteria, such as all footage captured on a certain date with the word Montana in the logging notes. It'll automatically update if you copy a new file into the folder that fits those criteria. You can even drag those clips straight into Final Cut Pro from the Final Cut Server interface.
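The smart-search model -- an AND of saved criteria evaluated against each clip's metadata -- is easy to picture in code. The field names and catalog below are invented for illustration; this is not Final Cut Server's actual data model:

```python
from datetime import date

# A hypothetical slice of an asset catalog: one metadata dict per clip.
catalog = [
    {"name": "surf_01", "camera": "Red One", "shot": date(2008, 6, 20),
     "keywords": {"beach", "sunset"}, "resolution": "4K"},
    {"name": "city_02", "camera": "HVX200", "shot": date(2008, 6, 24),
     "keywords": {"street"}, "resolution": "720p"},
    {"name": "surf_02", "camera": "Red One", "shot": date(2008, 7, 2),
     "keywords": {"beach"}, "resolution": "4K"},
]

# A "saved search" is just a reusable list of predicates, all of which
# must match -- the same AND-of-criteria model the smart search uses.
saved_search = [
    lambda c: c["camera"] == "Red One",
    lambda c: c["shot"] < date(2008, 6, 25),
    lambda c: "beach" in c["keywords"],
]

hits = [c["name"] for c in catalog if all(p(c) for p in saved_search)]
print(hits)  # only the clip matching all three criteria survives
```

Because the search is stored as criteria rather than as results, new footage that fits the criteria shows up automatically, which is exactly the live-updating behavior described above.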
The downside is that Final Cut Server doesn't let you search all of the metadata you've captured by default. Things like the codec used, the pixel dimensions of the footage, and other useful information are cataloged but not searchable in the standard configuration. Both Final Cut Pro and the Finder have additional search criteria not readily available by default in Final Cut Server--a major omission. I spent an hour going over the documentation to figure out how to add the above as searchable criteria in the main interface, and I only partially succeeded (codec, not pixel size).
Search speed is snappy (even if cataloging and thumbnail building isn't), so Final Cut Server is sufficiently responsive when you're trying to sift through thousands of files.
Version control and reviewing
Version control consists of two things: keeping track of multiple versions of the same project file over time, and allowing you to revert to older ones; and checking-in and checking-out capabilities. For projects that require more than one editor or artist, versioning is an automated way to make sure only one person at a time works on a file.
While Final Cut Pro lets you create an Auto Save Vault (which automatically saves a copy of your project at specified intervals), every time you check out a project file from Final Cut Server, two things happen--a copy is made of the Final Cut Pro project, and the file shows up as locked in Final Cut Server--telling the team that you are the only one allowed to make changes at the moment. This prevents others from accidentally overwriting your changes--a common risk in multiuser environments.
When you check the project back in, Final Cut Server automatically saves it as a different version, even if it still has the same file name. This allows you to find and examine the version you created in the past. Checking in and checking out require more discipline from editors and artists but guarantees that you can roll back to prior versions quickly, easily, and reliably. Think of it as a more carefully nuanced version of Time Machine for your files that are associated with Final Cut Pro.
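The check-out/check-in contract described above boils down to a lock plus a version list. The class and method names here are hypothetical, sketched only to show the mechanism, not Final Cut Server's API:

```python
# Checking a project out locks it to one editor; checking it back in
# releases the lock and files the submitted copy as a new version.

class ProjectStore:
    def __init__(self):
        self.locks = {}      # project name -> editor holding the lock
        self.versions = {}   # project name -> list of checked-in copies

    def check_out(self, project, editor):
        if project in self.locks:
            raise RuntimeError(f"{project} is locked by {self.locks[project]}")
        self.locks[project] = editor

    def check_in(self, project, editor, contents):
        if self.locks.get(project) != editor:
            raise RuntimeError(f"{editor} does not hold the lock on {project}")
        self.versions.setdefault(project, []).append(contents)
        del self.locks[project]

store = ProjectStore()
store.check_out("promo.fcp", "alice")
# A second editor is refused until Alice checks the project back in:
try:
    store.check_out("promo.fcp", "bob")
except RuntimeError as err:
    print(err)
store.check_in("promo.fcp", "alice", "rough cut v2")
print(store.versions["promo.fcp"])
```

The version list is what makes rollback cheap: every check-in is retained, so reverting is just reading an earlier entry rather than undoing work.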
Final Cut Server can also copy all of the necessary assets for a project to a given location when you check out that project--great for taking a FireWire drive with the project to work on elsewhere--and can even create edit proxies (compressed versions of the source--smaller and easier to move around). When you're done, check the Final Cut Pro project back in, and safely delete the copy Final Cut Server made of the media--the source footage is safely where it started.
Another significant feature is the approval chain--Final Cut Server lets you indicate whether a project has been or is ready to be reviewed, or whether it was approved or rejected at any step in the production. This can be tied into the program's automation features to do useful things like copy a file to a location, convert it to a deliverable format, post it to a Web site for approval, and more.
One of the big selling points of Final Cut Server is automation--the program's ability to complete tasks for you automatically and unattended. This was the feature I was most eagerly anticipating--as I do a lot of file-based conversions, waiting for Step 3 to finish so I can get on with Step 4 of my process.
Final Cut Server can perform automated tasks by watching folders--if new files are placed in a folder you've told the program to watch, Final Cut Server can start an automated process, which can be something like running files through Apple's Compressor to convert them to a more useful format, or to multiple formats. Or perhaps you have a finished edit, and you want MPEG-2 files to burn to a DVD, or maybe convert H.264 files for the Web to three different sizes.
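The watch-folder idea reduces to polling a directory and handing new arrivals to a callback. The sketch below is a deliberately simplified assumption about how such a loop works (production systems also wait for files to finish copying before processing them, which this toy skips):

```python
import os
import tempfile

def watch_once(folder, seen, on_new_file):
    """One polling pass: invoke on_new_file for files not seen before."""
    for name in sorted(os.listdir(folder)):
        if name not in seen:
            seen.add(name)
            on_new_file(os.path.join(folder, name))

# The callback stands in for a conversion step such as a Compressor job.
converted = []
with tempfile.TemporaryDirectory() as folder:
    seen = set()
    watch_once(folder, seen, converted.append)   # empty folder: nothing happens
    open(os.path.join(folder, "clip.mov"), "w").close()
    watch_once(folder, seen, converted.append)   # picks up the new file
print([os.path.basename(p) for p in converted])
```

In a real deployment the polling pass runs on a timer, and the callback kicks off the Compressor preset (or presets) for each new file.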
Since Final Cut Server uses Compressor for media conversion, you can set up a Compressor cluster (a group of machines working together to convert files faster).
You can also create scripts for complex operations, like publishing content to a custom Web page and having a client provide feedback via the Web back to Final Cut Server. The catch is, many of the advanced capabilities are contingent on custom scripting. You can use Ruby scripts, Automator scripts, and others, but it is up to you to provide or build them. That is a huge barrier for most users, and significantly limits the usefulness of the product for nonscripters if the application doesn't already do what you want it to do.
Final Cut Server is great for consultants and developers--the package is inexpensive for enterprise-level software, but to really get the most out of it, you could easily spend more on the custom integration work than on the software. Apple should provide a rich library of well-documented scripts for users as part of the shipping software. I found only one script available on the Apple Web site; it's a useful starting place, and I wish there were more to choose from. Apple says there will be more in the future.
Macworld's buying advice
If you are comfortable with databases and scripting, you can create some amazingly powerful and useful workflows with Final Cut Server 1.1. If you are diligent about entering useful metadata while logging footage, you'll have strong abilities to quickly sift through large amounts of footage and quickly find what you need, especially if you don't want all your media sitting in a single Final Cut Pro project. If you have projects that need multiple editors and artists--such as a feature film or TV show--the program's version-control, check-in and check-out, and review capabilities are incredibly useful.
On the other hand, if you don't have large editorial projects or multieditor projects, don't have the time or interest required for entering a lot of metadata, or aren't technically inclined or lack strong IT support, Final Cut Server may not have a lot to offer you.
While I've been eagerly anticipating Final Cut Server for a long time, now that it is here, I view it as a decent version 1.x product that does many useful things. For environments where even moderate amounts of data are being handled, Final Cut Server's capabilities for asset and job tracking, as well as automated conversions, can be immediately useful. However, I'll only recommend it to clients who have the technical chops to take advantage of its advanced features if it doesn't do what they need right out of the box.
[Mike Curtis runs HD for Indies, a consultancy and Web site focused on digital workflow solutions for high-quality content creation using high definition technology.]

Open Typed URLs in New Tabs

Everyone is probably aware of how to open links in new tabs in your favorite browser--just Command-click the link, and it will open in a new tab, instead of replacing your current window's contents. (This is a great way, for example, to browse the Macworld news page and open all the stories you'd like to read without losing the news page.)
But what if you're typing a URL, and you'd like it to open in a new tab? If you're using Safari, Camino, or OmniWeb, all you need to do is hold down the Command key prior to pressing Return after typing the URL. All three browsers will open the typed URL in a new tab, and will respect your preferences settings relative to new tab behavior--if you've got the preferences set to open new tabs in the background, then that's what will happen, and vice versa. (You can use this same trick in the Google search box, too.)
Firefox, however, is different. Instead of using Command-Return, you'll need to hold down Option and then press Return. This will force the URL to open in a new tab. Unfortunately, that new tab will open in the foreground, regardless of your preference settings. As I much prefer new tabs to open in the background (so I can continue reading the foreground tab), I went looking for a solution. I found that solution in a Firefox add-on called Tab Mix Plus. Unfortunately, that linked version won't work in Firefox 3, so I had to do further digging. Over in the Tab Mix Plus forums, this thread contains links to developer builds that work with Firefox 3.
As of today, the top post in that forum links to Tab Mix Plus Dev Build 0.3.7pre.080721, which works fine on my Firefox 3.0.1 installation. To install it, just click the Dev-Build link in that first forum post. Firefox will display a message stating that installation has been blocked. Click Allow to go ahead and install the extension, then restart Firefox.
Once Firefox restarts, open its preferences and select the Tabs tab. Click the Tab Mix Plus Options button, then click on the Tab Focus tab in the new window that opens. In the section labeled 'Focus/Select tabs that open from,' remove the checkmark next to Address Bar. If you'd like to use this same trick in the Google search box, also remove the checkmark next to Search Bar. When done, click OK and close Firefox's preferences panel.
From now on, when you press Option-Return after typing in the URL bar (or search box), Firefox will open the resulting web page (or search results) in a new background tab.
Reference :

Microsoft SQL Server 2008 RTM on July 31?

Odd Kristoffersen: When will SQL Server 2008 ship? “When it’s ready,” which has been Microsoft’s standard answer for the release to manufacturing (RTM) date of pretty much every Microsoft product, and certainly for all its SQL Server products. But I was talking to a Microsoft representative the other day regarding an implementation of Dynamics AX 2009, and we are eager to use SQL Server 2008 in this deployment to take advantage of the database compression that will give great improvements in throughput, scalability and response time. I was then told that the scheduled release date is now July 31. Any follow-up questions regarding the release date of SQL Server 2008 were answered with the usual "I don't know, I cannot say." So we can wait six more days to see if this comes true, because the benefits of deploying on the 2008 platform will be great enough to make it worth the wait.
Reference :

Microsoft Announces Reorganization of Windows and Online Services Business

Platforms & Services Division to Split Into Two Groups and Report to CEO Steve Ballmer.
REDMOND, Wash. — July 23, 2008 — Microsoft Corp. today announced that the Platforms & Services Division (PSD) will be split into two groups: Windows/Windows Live and Online Services, with both groups reporting directly to CEO Steve Ballmer. Microsoft also announced that PSD President Kevin Johnson will be leaving the company. Johnson will work to ensure a smooth transition.
“Kevin has built a supremely talented organization and laid the foundation for the future success of Windows and our Online Services Business. This new structure will give us more agility and focus in two very competitive arenas,” Ballmer said. “It has been a pleasure to work with Kevin, and we wish him well in the future.”
Effective immediately, senior vice presidents Steven Sinofsky, Jon DeVaan and Bill Veghte will report directly to Ballmer to lead Windows/Windows Live. The Windows organization recently announced strong annual sales, with more than 180 million copies of Windows Vista sold globally, and it has driven more than 100 million installs of its Windows Live suite. The organization’s innovation pipeline includes a new version of Windows Internet Explorer, the next version of Windows and the next generation of the Windows Live product suite.
In the Online Services Business, Microsoft will create a new senior lead position and will conduct a search that will span internal and external candidates. In the meantime, Senior Vice President Satya Nadella will continue to lead Microsoft’s search, MSN and ad platform engineering efforts. Microsoft recently announced a strategy to redefine search through innovations in the user experience and business models. As an example, the company’s cashback search program, announced in May, is already generating strong momentum among online shoppers and advertisers.
In addition, Senior Vice President Brian McAndrews will continue to lead the Advertiser & Publisher Solutions Group (APS). APS has great momentum, having signed more than 100 new publisher deals in the past year. McAndrews will continue to focus on the display advertising opportunity for Microsoft, driving execution and integration of advertising assets, including recent acquisitions such as Massive Inc., Navic Networks, ScreenTonic SA and YaData Ltd.
“Our Windows business is firing on all cylinders,” Ballmer said. “We see tremendous opportunity in search and advertising, and we have a clear strategy for investing in success today and growth in the future.”
“Microsoft is a special place and presents opportunity to so many,” Johnson said. “I have been so fortunate to have experienced 16 amazing years of building Microsoft’s business, learning from great leaders in the company and working with phenomenally talented people.”
Founded in 1975, Microsoft (Nasdaq “MSFT”) is the worldwide leader in software, services and solutions that help people and businesses realize their full potential.
Note to editors: If you are interested in viewing additional information on Microsoft, please visit Microsoft's corporate information pages on the Web. Web links, telephone numbers and titles were correct at time of publication, but may since have changed. For additional assistance, journalists and analysts may contact Microsoft's Rapid Response Team or other appropriate contacts.
Reference :

Microsoft offers first hints at anti-Apple marketing blitz for Vista

Microsoft this week offered a window into the first phase of a multimillion-dollar advertising campaign designed to clear up 'misconceptions' about the quality of its Windows Vista operating system, misconceptions exacerbated by in-your-face marketing efforts on the part of longtime rival Apple.
The first series of ads in the campaign were reportedly met with rave reviews last week when they were previewed at Microsoft’s employees-only Global Exchange conference. "[I] got goosebumps - just, wow," said one insider who was privy to the preview.
While those ads have yet to surface publicly, the first instances of an associated Web campaign spotted by ZDNet suggest the Redmond-based software giant will attempt to alter the perception of Vista by likening its critics to proponents of flat-earth theory. Instead of responding directly to Apple's influential "I'm a Mac, I'm a PC" global campaign, an ad that appears on the company's website recalls that "at one point, everyone thought the Earth was flat." It links to a recently established information base on the state of Windows Vista, which attempts to "clear up some confusion and lingering misunderstandings" about the XP successor while simultaneously admitting to some early missteps.
"We know a few of you were disappointed by your early encounter. Printers didn't work. Games felt sluggish. You told us—loudly at times—that the latest Windows wasn't always living up to your high expectations for a Microsoft product," the company said. "Well, we've been taking notes and addressing issues."
What follows is a series of common questions from would-be Vista adopters, such as "Why do I keep reading that Windows Vista won't work with my hardware or software?," along with answers.
"We know that's what some people are saying on the Internet," the company said, opting not to mention Apple by name. "And in its early days, Windows Vista did experience some compatibility problems. But thanks to our industry partners' efforts during the past 18 months, here's where things stand today."
According to Microsoft, Vista now supports nearly 77,000 hardware products, runs 98 of the top 100 consumer software programs, and works with all of the leading small business applications.
The security of Vista is another matter that Microsoft believes has been exploited unfairly by Apple's marketing tactics. As such, the company's references to the Mac maker and its Leopard operating system in this regard are not as subtle. "Windows Vista has fewer than half the security vulnerabilities of Windows XP," Microsoft said. "It's also 60% less likely to be infected by spyware or malware than Windows XP SP2. And in early 2008, Windows Vista was shown to have 89% fewer vulnerabilities than Mac OS X 10.5, making it the most secure Windows release to date."
In total, it's believed that Microsoft plans to spend more than $300 million on its pro-Vista, anti-Apple marketing blitz before the dust settles.
Reference :

Seven Things IT Should Be Doing (but Isn't)

Pity the poor IT managers.
They're expected to know what their end-users want and need, even if their end-users can't articulate it themselves. They're under constant pressure to develop new skills (like AJAX) while maintaining old ones (COBOL, anyone?), and to not only maintain line-of-business apps but also jazz them up to meet the expectations of the Facebook generation.
They've got to deal with a data tsunami that increases more than 30 percent per year while simultaneously protecting the company jewels from devastating data spills. They're required to gird for disasters of unknown proportions and figure out how to keep the business going in the aftermath.
[ Think you've got it bad? Check out "The 7 dirtiest jobs in IT." ]
And, oh yeah -- they need to take a few business finance courses. In their copious spare time, of course.
Tough job? You bet. But in this Web 2.0-centric data-engorged world, it's the cost of doing business. Do them well and both you and your company will succeed.
Here are seven (more) things to add to your must-do list. Ignore them at your peril.
[ See also our slideshow "Seven things IT should be doing (but isn't)" ]

No. 1: Follow your users
You don't have to hire a gumshoe to find out how people actually use technology inside your company's walls, but it couldn't hurt.
"IT folks should shadow their users to find out what they really do for a living," says Jonathan Ezor, assistant professor of law and technology at Touro Law Center in Central Islip, N.Y. IT personnel often complain users don't understand enough about technology, but Ezor says the opposite is also true -- IT recommendations don't reflect the real world of users.
[ Or you can cut down on the to-do list by putting end-users to work. See "Guerrilla IT: How to stop worrying and learn to love your superusers" ]
Case in point: pervasive wireless Net access. Great for many companies, but a potential disaster in Ezor's law classrooms. So starting next fall, some of the school's IT managers will begin auditing Ezor's classes, to get a feel for what student life is like.
Even better: Shoulder-surf your biggest customers. It's the best way to figure out what works and what needs fixing, says Richard Rabins, co-founder of database maker Alpha Software.
When Alpha builds custom apps for its biggest clients, it puts a development team inside the offices of the departments that will ultimately be using the software.
"Having developers feel the actual pain is very powerful," says Rabins. "If our IT folks can walk in the shoes of users and understand their business processes, that gives us a real competitive edge."

No. 2: Embrace Web 2.0
Like it or not, we live in a Facebook/Twitter/iPhone world. And if your line-of-business apps don't sport the latest Web service features, you could lose your best young employees to a company that does.
"Many IT organizations are not as ready for Web 2.0 as they need to be," says David McFarlane, COO for Nexaweb Technologies, which makes software and services for modernizing legacy applications. "They need to prepare for the millennium generation -- the audience that has only heard the legend of the DOS interface and expects to have a ubiquitous iPhone-like experience every time they touch a computer or related device."
[ For tips on giving your apps a Web 2.0 makeover, see "Rich Web development tools bring bling to the browser." ]
Your youngest, most tech-savvy employees expect to interact with the system from any browser, whether it's on their laptop or a cell phone, and access virtually any data from anywhere. If you don't provide that, somebody else will.
"Part of the reason to embrace Web 2.0 is to show your employees that your company is forward-thinking and willing to do things differently," says Jim Lanzalotto, vice president of Yoh, a technology talent and outsourcing firm. "It sounds bizarre, but if you don't do enough to energize your employees, they may lose interest in you as a company."
Your customers also have increasingly high expectations, adds Nexaweb's McFarlane.
"They expect to be part of the extended enterprise," McFarlane says. "If they order a part from you, they expect to be able to track where it is in the process, when it was dispatched, where it is now. If they file an insurance claim, they expect to participate, to take photos of the damage and upload them to the file. Companies can't departmentalize these things anymore. You need to deliver rich, compelling, engaging applications for your customers as well."

No. 3: Tame the data monster
Bad, incomplete, or unusable data has been the bane of thousands of enterprises. Even data that's perfectly usable in one form may be useless in a broader context -- which leads to poor decision-making.
Tony Fisher, CEO of data-quality specialists DataFlux, recalls the time he was working with the CEO of a Fortune 10 company who was concerned about the aging population of the company's workforce.
[ How important is data cleansing and validation? Read "The perils of dirty data" and beware. ]
"His first question was, 'How many employees do we have and where are they?'" says Fisher. "But the best estimate he could get was between 90,000 and 115,000. He was never able to drill down to age of the population or its distribution."
The problem: It was a huge global company with 120 locations, each with an HR system that treated data just a little bit differently. The data was sufficient for the needs of the local organizations, says Fisher, but they couldn't integrate it across different systems -- an increasingly common dilemma for many enterprises.
"Better data makes for a better business," Fisher says. "You need to make sure data isn't just accurate but is also fit-for-purpose, so it can drive business initiatives."
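Fisher's headcount problem is, at heart, a record-reconciliation exercise. A minimal Python sketch (field names and records invented for illustration, not taken from the article) of normalizing two divergent HR feeds into one canonical shape and deduplicating on employee ID:

```python
# Hypothetical sketch: reconciling employee records from two HR systems
# that store the "same" data under slightly different field names.

def normalize(record):
    """Map a raw HR record onto one canonical shape."""
    return {
        "id": record.get("emp_id") or record.get("employee_number"),
        "name": (record.get("name") or record.get("full_name", "")).strip().lower(),
        "location": (record.get("loc") or record.get("site", "")).strip().upper(),
    }

def headcount_by_location(*systems):
    """Merge records from every system, deduplicating on employee id."""
    canonical = {}
    for system in systems:
        for raw in system:
            rec = normalize(raw)
            canonical[rec["id"]] = rec  # last system wins on conflicts
    counts = {}
    for rec in canonical.values():
        counts[rec["location"]] = counts.get(rec["location"], 0) + 1
    return counts

# Two toy "HR systems" with divergent conventions and one shared employee.
us_hr = [{"emp_id": 1, "name": "Ann Lee", "loc": "nyc"},
         {"emp_id": 2, "name": "Bo Chan", "loc": "nyc"}]
eu_hr = [{"employee_number": 2, "full_name": "Bo Chan", "site": "NYC"},
         {"employee_number": 3, "full_name": "Cy Roe", "site": "Paris"}]

print(headcount_by_location(us_hr, eu_hr))  # {'NYC': 2, 'PARIS': 1}
```

Without the normalization step, the shared employee would be counted twice and "nyc" and "NYC" would look like two locations -- a toy version of the 90,000-versus-115,000 gap Fisher describes.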
And this should be done sooner rather than later because the data deluge is only growing. Studies have found that the amount of data generated per year is growing by 35 to 40 percent, notes Sean Morris, sales director at Digitech Systems, an enterprise content management provider.
"IT folks need to take a closer look at how they are capturing, encrypting, and storing all the data their companies generate, including e-mail, invoices, and contracts," says Morris. "Companies with a solid ECM strategy will have a competitive advantage going forward, and IT can be positioned to be the hero."

No. 4: Flirt with disaster
Many organizations think they have a disaster recovery plan in place, only to find out too late it's inadequate. Or they think that simply backing up their data is enough, with no way to keep the biz running -- and the revenue flowing -- while they attempt to recover.
"You'd be surprised how much downtime happens -- as well as lost goodwill from clients and vendors -- when you lose your data," notes Dimitri Miaoulis, vice president of Baroan Technologies, which provides 24/7 tech support for small businesses. "Every business needs a continuity plan that describes how it will continue to function, not only with technology but mail, fax, deliveries, phone calls, where people go, and what do they do."
But simply having a plan isn't enough -- it needs to make sense in real-world situations, says John Biglin, CEO of Interphase Systems, a management and technology consultancy.
"We had one client, a multi-billion-dollar HR services company, with a disaster recovery manual four inches thick," Biglin says. "On its Exchange Server Configuration page, there was one sentence: 'See company intranet for the latest information.' If the network at their corporate headquarters went down, they'd be completely hosed."
Blank backup sets, crumbling storage media, and recovery plans that haven't been updated since 9/11 -- all are recipes for an even bigger disaster. Large firms may have a comprehensive continuity plan but fail to update it regularly or do dry runs to see if they actually can recover and keep operating, says Biglin.
"Even customers who have a plan rarely take the time to validate that it works," he adds. "Unless you've tested it and can show that it truly works, you don't have a plan."
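Part of the "validate that it works" step can be automated. A hedged Python sketch (directory layout and function names are mine, not from the article) that checks a restored backup against the source tree, catching both missing files and silent corruption:

```python
# Hypothetical sketch: after restoring a backup to a scratch directory,
# confirm every source file came back intact.

import hashlib
from pathlib import Path

def digest(path):
    """SHA-256 of a file's contents."""
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()

def verify_restore(source_dir, restored_dir):
    """Return a list of (file, problem) pairs; empty means the restore passed."""
    problems = []
    source = Path(source_dir)
    restored = Path(restored_dir)
    for original in source.rglob("*"):
        if not original.is_file():
            continue
        copy = restored / original.relative_to(source)
        if not copy.is_file():
            problems.append((str(original), "missing"))
        elif digest(copy) != digest(original):
            problems.append((str(original), "checksum mismatch"))
    return problems
```

Run on a schedule against a real restore (not just a backup log), an empty result is evidence the plan works; a non-empty one is the dry-run failure you want to find before the disaster does.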

No. 5: Capture old knowledge (before it disappears)
Odds are you have at least some of your key business data written in an ancient computer language, locked away on old iron, or buried inside the brains of aging coders. You need to capture that knowledge and bring it into the service-oriented century, or have a staff of semi-retired COBOL programmers on hand to draw from.
"The biggest thing IT isn't doing is capturing the 'corporate knowledge/culture' that their retiring IT staff has," says Robert Rosen, CIO of a U.S. government agency. "It's all the stuff not captured that will come back to bite IT when something fails and they say, 'Joe always knew how to do that.'"
It's not just the graybeards, says Venkat S. Devraj, co-founder and CTO of datacenter automation firm Stratavia. Everyone's day-to-day tasks need to be documented so that business processes continue to flow. "Otherwise, when an employee is on vacation, gets sick, is promoted, or leaves the company, the IP [intellectual property] is not available to get the job done with the same level of quality and predictability," he says.
The bigger, more important step: Become less dependent on aging code, says McFarlane, whose Nexaweb Advance software explores aging code, documents the business logic and rules embedded within it, and transforms it into a modern Java application that can be delivered over the Web.
"Enterprises must learn how to be less dependent on the shrinking number of folks who are well versed in the applications running the business like COBOL, PowerBuilder, and Oracle Forms," McFarlane says. "Most CIOs won't admit it, but not only do many of them not know how these applications work, they don't know if these applications work. All they know is they've got 30 million lines of COBOL code and no COBOL programmers, institutional knowledge, or documentation. They need to go in and liberate their intellectual property from the bowels of legacy systems."

No. 6: Plug data leaks
Data spills are almost inevitable, but you can minimize risk and mitigate damage by keeping an eye on orphaned accounts, lax oversight of permissions, and mobile data access.
A survey of more than 850 executives by security firm Symark revealed that 42 percent of all businesses have no idea how many orphaned accounts exist on their networks, and nearly one-third have no procedure for removing them. Worse, many organizations are lax about policing who's allowed to access what data on the network.
[ Beware the top 10 security land mines, and keep up with the latest security trends on Roger Grimes' Security Adviser blog. ]
"It's not uncommon for folders on file shares to have access control permissions allowing everyone to access the data inside it," says Johnnie Konstantas, vice president of marketing at Varonis Systems, a data governance solutions provider. "Global access to folders should be removed and replaced with rules that give access to the explicit groups that need it."
Konstantas says IT departments need to maintain a current list of everyone who "owns" each data store and review or revoke permissions on a regular basis.
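The orphaned-account audit the Symark survey found missing at 42 percent of businesses reduces, in its simplest form, to a set comparison between the account directory and the HR roster. A toy Python sketch (account list and roster invented for illustration):

```python
# Hypothetical sketch: flag "orphaned" logins, i.e. accounts whose owner
# no longer appears on the HR roster of active employees.

def find_orphaned_accounts(directory_accounts, hr_roster):
    """directory_accounts: iterable of (username, owner_employee_id) pairs.
    hr_roster: set of employee ids still on the payroll.
    Returns a sorted list of usernames with no active owner."""
    return sorted(user for user, owner in directory_accounts
                  if owner not in hr_roster)

accounts = [("alee", 101), ("bchan", 102), ("svc_backup", 900), ("jdoe", 103)]
active_employees = {101, 102}  # 103 left the company; 900 was never a person

print(find_orphaned_accounts(accounts, active_employees))  # ['jdoe', 'svc_backup']
```

Note the service account shows up too: unowned machine accounts are exactly the kind of entry this review should force someone to claim or disable.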
Lax permissions policies, coupled with the growing threat from rogue mobile devices, raise the possibility of accidental data spills and deliberate data breaches, notes Ben Halpert, an information security researcher and consultant.
"The current security model is inadequate for dealing with today's threats," he says. "When it comes to mobile security, every organization needs to recognize certain realities. The first is that you can't stop mobile device proliferation. The second is that user awareness alone is ineffective. And third, point solutions like encryption will only shift the target."
A December 2007 survey conducted by the Ponemon Institute found that nearly 40 percent of employees have reported losing a mobile device containing company data, and that more than half copied sensitive data to USB drives despite company policies forbidding the practice.
Halpert says enterprises need to implement an overarching strategy for mobile security, taking into account technology, user populations, and processes.
"While the majority of your workforce does not have malicious intent, those involved in social engineering are masters of the human condition and will attain the information they desire," Halpert warns.

No. 7: Follow the money
If IT wants to overcome its reputation as a corporate money suck, tech managers need to learn a few things about the bottom line -- including how to translate long-term goals into quarterly results for the CFO.
"Having financial knowledge is important, especially when you've got a $50 million IT budget that can easily spiral out of control," says Interphase Systems' Biglin. "The CIO can't approve every invoice. We find IT directors managing multimillion-dollar projects who don't know what costs to capitalize and which ones to expense. If you don't understand the difference, it's easy to wind up a year down the road where something has to be reclassified. It can really impact companies that report their numbers to Wall Street."
Basic concepts -- such as the difference between cash flow and profits -- need to extend throughout the IT organization, says Joe Knight, co-author of "Financial Intelligence for IT Professionals: What You Really Need to Know About the Numbers."
"I think everybody in the IT department needs to understand how projects are made, why they're important, and the future benefits they will bring to the company," says Knight. "If you can speak the language of finance and present your IT case in a financially astute way, you'll not only make better decisions but you'll also be able to drive your decision through the organization."

Reference :

Nasser Hajloo
a Persian Graphic Designer , Web Designer and Web Developer
