Sunday, May 31, 2009

PSP Go photos leaked






The much-hyped next-generation Sony PSP has been leaked. There were recent rumours that Sony was about to launch a UMD-less PSP under the name "Go", and those rumours now appear to be turning into reality. The leaked photos show a PSP with no visible keys around the screen, a slide-out keypad and Bluetooth connectivity. The device is also said to come with 16GB of internal memory. The PSP Go will sell alongside the PSP 3000, so proud PSP owners needn't worry. Still, I think traditional PSP owners will find the sliding keypad less comfortable; the keypad design and key positioning seem better suited to Nintendo Game Boy fans. So Nintendo had better watch out.

Photos from www.engadget.com

Thursday, May 21, 2009

Blu-ray about to become outdated

Researchers at Swinburne University of Technology in Australia have figured out a way to store almost 1.6TB of data on a single optical disc; that is, you would be able to fit almost 300 DVDs onto the new disc. Calling the method "five-dimensional optical recording," the technique employs nanometer-scale particles of gold.
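As a rough sanity check on that "almost 300 DVDs" figure (the researchers don't say which DVD capacity they're assuming, so this is only ballpark arithmetic):

```python
# Back-of-the-envelope check on the disc-capacity claim; the DVD sizes below are
# standard figures, but the comparison is only approximate.
disc_capacity_gb = 1600        # ~1.6 TB claimed for the five-dimensional disc
dvd_single_layer_gb = 4.7      # single-layer DVD
dvd_dual_layer_gb = 8.5        # dual-layer DVD

print(round(disc_capacity_gb / dvd_single_layer_gb))  # ~340 single-layer DVDs
print(round(disc_capacity_gb / dvd_dual_layer_gb))    # ~188 dual-layer DVDs
```

Either way, the "hundreds of DVDs on one disc" ballpark holds up.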

Tuesday, May 19, 2009

Age of computational search coming




18 May 2009 saw the launch of Wolfram Alpha, the online computational search engine.
Web users can submit customized questions to the service, and Wolfram Alpha will try to work out the answer on the fly. The chance that a healthy 35-year-old woman will contract heart disease in the next 10 years? One in 167. The temperature in Washington, D.C., during the July 1976 bicentennial? An average of 74 degrees.

For questions like these, Google and Wikipedia, perhaps the two best known online reference tools, would search through vast databases of existing Web pages hoping for a match.

Not so with Wolfram Alpha. "We're not using the things people have written down on the Web," said Stephen Wolfram, the project's creator and the founder of Wolfram Research Inc., which is based in Champaign, Ill. "We're trying to use the actual corpus of human knowledge to compute specific answers."
To do that, Wolfram and his team of human curators have equipped their system with a wide array of mathematical equations, as well as 10 terabytes of data from thousands of sources: scientific journals, encyclopedias, government repositories and any other source the company feels is credible. That generally doesn't include user-created websites.
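To get a feel for the difference between retrieving a page and computing an answer from curated data, here's a toy sketch. The table and function below are invented for illustration and have nothing to do with Wolfram Alpha's internals; the point is only that an answer (like the 74-degree average above) is calculated on the fly rather than looked up as a sentence someone already wrote.

```python
# Toy illustration: a small curated table plus a formula lets an engine *compute*
# an answer instead of searching for a web page that already states it.
# The readings below are made up for the sake of the example.
CURATED_TEMPS_F = {
    ("Washington, D.C.", "1976-07"): [72, 75, 78, 71, 74, 76, 73],
}

def average_temperature(city: str, month: str) -> float:
    readings = CURATED_TEMPS_F[(city, month)]
    return sum(readings) / len(readings)

print(round(average_temperature("Washington, D.C.", "1976-07")))  # computed, not retrieved
```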

How much data is 10 terabytes? Ask Wolfram Alpha: It'll tell you that's about half the text-based content held by the Library of Congress.

And there's more to come.

Adding more data and computational capability is an endless process, Wolfram said. "The main thing we have to do is to work with experts in every possible domain."

Whether all that specific knowledge will translate into advertising dollars remains to be seen. Some analysts are skeptical about the site's potential to become a Google-like thoroughfare for online consumption.

Most search revenue comes from people doing commerce-related searches, said Douglas Clinton, an analyst at investment firm Piper Jaffray Cos. "You're not going to want an answer from Wolfram Alpha's computer about what the best digital camera is, because there's not really an algorithmic answer to a question like that."

As a much-hyped entrant into the knowledge search market, Wolfram Alpha has not escaped comparisons to Google and speculation about whether it could steal some of the search giant's massive market share.

But their mission statements make it clear that the two services are not identical.

Google famously hopes to "organize the world's information and make it universally accessible and useful."

The focus of Wolfram Alpha, on the other hand, is to "make it possible to compute whatever can be computed about anything."

Lofty hopes, but neither is there yet.

Wolfram Alpha can display the molecular structure of the solvent acetone. It can list recent earthquakes near most U.S. cities. And it can tell you the rate of inflation in Tanzania.

Yet it gets tripped up on a question as simple as "What time is it?"

As Wolfram himself points out, making the engine smarter is not just a matter of shoveling in more data. Even when the answer already exists in the database, the software may simply be unable to understand the question.

"What time is it in California," for instance, yields the correct result.

Half the battle, then, is teaching the program to parse human language so it knows what it's being asked to do.
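Here's a toy sketch of what that parsing problem looks like in practice. This is not how Wolfram Alpha actually works; the place list and the matching rule are invented. The point is that the underlying computation (current time in a time zone) is trivial, and the question succeeds or fails depending on whether the engine can anchor it to something concrete.

```python
# Toy question parser: the computation is easy; the hard part is deciding what
# the question is actually asking for. Everything here is hypothetical.
from datetime import datetime
from zoneinfo import ZoneInfo

PLACES = {"california": "America/Los_Angeles"}  # invented mini-gazetteer

def answer(question: str) -> str:
    q = question.lower().rstrip("?")
    if "time" in q:
        for place, tz in PLACES.items():
            if place in q:
                return datetime.now(ZoneInfo(tz)).strftime("%H:%M %Z")
        return "Sorry, I don't know what to do with that input."  # no location to anchor on
    return "Sorry, I don't know what to do with that input."

print(answer("What time is it in California"))  # parses, so it computes an answer
print(answer("What time is it?"))               # too vague for this toy parser
```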

But as rough as it may seem now, Wolfram Alpha looks to be the leading edge of a newer, smarter crop of search engines.

It's the use of so-called semantic technologies, where computers grapple with concepts and simple learning, that may define the next generation of Web services.

Does that mean artificial intelligence? Not quite yet, said James Hendler, a professor of computer science at New York's Rensselaer Polytechnic Institute.

"Computers are getting very good at the sort of powerful learning that comes from recognizing patterns in very large sets of data," he said. "But they still haven't gotten at all good at figuring out the very general, intuitive, complex things that make us human."

Source: LA Times

Tuesday, May 12, 2009

New H.264 codec to make HD the de facto display standard



We've already got HD in places that the cast of Step by Step would've sworn was never possible way back when, but eASIC is far from satisfied. To that end, it's introducing a new H.264 codec aimed at bringing high-def capabilities to all manner of devices, including (but certainly not limited to) toys, baby monitors, public transportation, wireless video surveillance and wireless webcams. The highly integrated eDV9200 is said to "dramatically lower the cost of entry into the high-definition video market, enabling a new class of low-cost applications to fully leverage the benefits offered by HD technology." Best of all, these guys aren't just blowing smoke, as the chip -- which captures streaming data directly from a CMOS sensor, compresses it, and transfers it to a host system or to a variety of storage devices -- is priced at just $4.99 each in volume.
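For a sense of the dataflow the chip handles, here's a minimal sketch of the capture-compress-transfer loop. Everything below is a hypothetical stand-in (Python stubs, made-up frame size and compression ratio), not eASIC's API; in the real product this all happens inside the ASIC.

```python
# Sketch of the eDV9200's described dataflow: CMOS sensor -> H.264 encode -> host/storage.
# All three functions are hypothetical placeholders, not a real driver interface.

def read_cmos_frame() -> bytes:
    """Stand-in: grab one raw frame from the image sensor."""
    return bytes(1280 * 720 * 2)          # placeholder 720p frame, 2 bytes per pixel

def h264_encode(raw_frame: bytes) -> bytes:
    """Stand-in: compress a raw frame into an H.264 packet (~50:1 is only illustrative)."""
    return raw_frame[: len(raw_frame) // 50]

def send_to_host(packet: bytes) -> None:
    """Stand-in: hand the compressed packet to the host system or a storage device."""
    print(f"transferred {len(packet)} bytes")

for _ in range(3):                        # a real device runs this per captured frame
    send_to_host(h264_encode(read_cmos_frame()))
```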

Source: www.engadget.com

Image Spam on the rise

Image-based spam is on the rise once again after hitting near-extinction late last year. Ralf Iffert and Holly Stewart of IBM's X-Force team detailed the phenomenon in a blog post last week, noting that the techniques used in the latest waves of image spam aren't any different from those seen during its height in 2006 and 2007, and that most spam blockers should be able to catch them. Still, the new rise in this old spamming practice indicates that spammers are once again pulling out all the stops to drum up business.

According to the two researchers, image spam saw its heyday in 2006 and 2007 when it got as high as almost 45 percent of all spam. It began to tank, however, in the second half of 2007, with 2008 practically putting a nail in the image spam coffin. The spamming method had dropped to only five percent of all spam in October of 2008 before the notorious McColo shutdown, subsequently taking image spam down to less than one percent of all spam in November.

This was apparently the calm before the storm, however. Image-based spam first hit five to 10 percent of all spam in March of 2009 before skyrocketing to 15-22 percent in April. Iffert and Stewart note that—unlike a couple of years ago—it seems to focus more on drugs and pharmaceuticals instead of stock trading (most likely due to the current financial crisis). Most don't contain any clickable Web links and rely on the user to manually type in a URL into the browser.
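As a rough illustration of why most blockers should still catch this stuff, here's a toy heuristic built on exactly the traits described above: a message dominated by an inline image, with very little text and no clickable links. Real filters (X-Force's included) combine far more signals; this is only a sketch of the idea, and the thresholds are invented.

```python
# Toy image-spam heuristic based on the pattern described above. Thresholds are
# made up for illustration; a production filter would weigh many more signals.

def looks_like_image_spam(text_chars: int, inline_images: int, links: int) -> bool:
    return inline_images >= 1 and links == 0 and text_chars < 200

print(looks_like_image_spam(text_chars=40, inline_images=1, links=0))    # True: fits the pattern
print(looks_like_image_spam(text_chars=1500, inline_images=2, links=4))  # False: looks like a newsletter
```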

Why in the world would spammers decide to resurrect this technique, especially when it requires so much interaction on the user's end? "Perhaps they are trying to mask their URLs through these images," wrote the researchers. "In their trial run near the end of March, did they see that some anti-spam systems were losing their edge when it came to image spam? We don't think so. Are they simply running out of new ideas and rehashing old techniques? Maybe."

Because of the mysterious resurgence of image spam, Iffert and Stewart wonder whether other old 'n' busted spamming techniques will be coming back next, like PDF or MP3 spam. "Have we somehow hit a plateau of spam techniques? Who knows?" they wrote. "We can tell you that from the monitoring perspective, it all feels a bit strange. It's like sitting down to watch the storyline progress in your favorite TV show only to find that the directors have inexplicably substituted an 80's-style montage in its place."

Source: ComputerWorld

Monday, May 11, 2009

Google granted floating data center patent





The future of data centers appears to be a move from the land to the sea, with power coming from the movement of the water and cooling coming directly from the ocean. Google was granted a patent for a floating data center this week, allowing it to license out the technology to third parties if it should so choose.

Google's application for a "Water-based data center" patent was filed in February of 2007 and published late last year. It describes "a floating platform-mounted computer data center comprising a plurality of computing units, a sea-based electrical generator in electrical connection with the plurality of computing units, and one or more sea-water cooling units for providing cooling to the plurality of computing units."

The majority of the patent deals with the logistics of ship-based data centers, though it also examines the use of wave power, tidal power, and seawater for providing electricity and cooling to land-based data centers that are close enough to water.

Of course, there's nothing to stop Google from deploying a floating data center powered by conventional fuel sources, but such a vessel would be more limited by range or fuel capacity. Not only would it have to carry enough fuel to power itself, it would also have to carry enough to run the systems on board. Using a water-based generator would not only be more practical and efficient, it would also be a significantly greener solution.

Despite the patent, however, Google may not be the first company to send its data centers out to sea. A Silicon Valley startup called International Data Security (IDS) announced in January of 2008 its intent to set up a fleet of data-serving cargo ships. These ships would not only come with standard storage services, but also with amenities such as private offices, overnight accommodations, and galley services. The first ship was scheduled to set sail (or rather, hang out at San Francisco's Pier 50) in April of 2008, but according to a blog post by IDS partner Silverback Migration Solutions, that plan got pushed to the third quarter of 2008, and we were unable to find any further information on the project.

Silverback acknowledged Google's patent in September, however, noting that IDS and Google appear to be planning different implementations of the floating data center. If that's the case, then it's likely that the two won't be stepping on each other's toes. However, if other companies decide to implement floating solutions similar to Google's in the future, they may find themselves having to pay licensing fees. Given the current economic climate, though, let's just say we don't expect to see a mass data center exodus to the ocean anytime in the near future.

Author: Jacqui Cheng
Source: www.arstechnica.com

Sunday, May 10, 2009

Mini HDMI



Mini HDMI connectors are on their way. Prototypes based on the latest Type D spec have appeared, targeted mainly at cellphones, GPS systems and other portable devices. The connector is only about half the size of a regular HDMI connector and almost the same size as a micro USB connector, yet it carries the same 19 pins as the standard HDMI design.
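For rough context, here's how the Type D plug stacks up against a standard HDMI plug and micro USB. The figures are approximate, commonly quoted dimensions rather than anything from the prototype announcement, so treat them as ballpark:

```python
# Approximate connector comparison; widths are ballpark published figures, not spec-exact.
connectors = {
    "HDMI Type A (standard)": {"pins": 19, "plug_width_mm": 13.9},
    "HDMI Type D":            {"pins": 19, "plug_width_mm": 6.4},
    "Micro USB":              {"pins": 5,  "plug_width_mm": 6.85},
}

for name, spec in connectors.items():
    print(f"{name}: {spec['pins']} pins, ~{spec['plug_width_mm']} mm wide")
```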