Friday, June 12, 2009

E-Ink: A revolutionary technology still waiting to be tapped

The recent release of Amazon's Kindle e-reader has left everyone awed and prompted industry leaders to release similar devices. Apart from e-book readers, e-ink is also used to manufacture various displays such as flexible readers, hoardings, and mobile displays. But how many of us know about this highly power-efficient technology? Below is an excerpt from the official website of E Ink.

The E Ink Corporation was founded in 1997 by Joseph Jacobson, a professor in the MIT Media Lab.
Electronic ink is a proprietary material that is processed into a film for integration into electronic displays. Although revolutionary in concept, electronic ink is a straightforward fusion of chemistry, physics and electronics to create this new material. The principal components of electronic ink are millions of tiny microcapsules, about the diameter of a human hair. In one incarnation, each microcapsule contains positively charged white particles and negatively charged black particles suspended in a clear fluid. When a negative electric field is applied, the white particles move to the top of the microcapsule where they become visible to the user. This makes the surface appear white at that spot. At the same time, an opposite electric field pulls the black particles to the bottom of the microcapsules where they are hidden. By reversing this process, the black particles appear at the top of the capsule, which now makes the surface appear dark at that spot.



To form an E Ink electronic display, the ink is printed onto a sheet of plastic film that is laminated to a layer of circuitry. The circuitry forms a pattern of pixels that can then be controlled by a display driver. These microcapsules are suspended in a liquid "carrier medium" allowing them to be printed using existing screen printing processes onto virtually any surface, including glass, plastic, fabric and even paper. Ultimately electronic ink will permit most any surface to become a display, bringing information out of the confines of traditional devices and into the world around us.
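To make the switching logic above concrete, here is a toy model of a single microcapsule. This is my own simplification of the excerpt, not E Ink's driver code; the names and the two-state field are assumptions for illustration only.

    /* Toy model of one microcapsule: positively charged white particles
       move toward a negative field, negatively charged black particles
       toward a positive one. A sketch, not E Ink's actual logic. */
    #include <stdio.h>

    typedef enum { FIELD_NEGATIVE, FIELD_POSITIVE } field_t;

    /* Color visible at the top surface of a capsule, given the field
       applied at the top electrode. */
    static const char *capsule_color(field_t top_field)
    {
        return (top_field == FIELD_NEGATIVE) ? "white" : "black";
    }

    int main(void)
    {
        printf("negative field at top -> %s pixel\n", capsule_color(FIELD_NEGATIVE));
        printf("positive field at top -> %s pixel\n", capsule_color(FIELD_POSITIVE));
        return 0;
    }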

Wednesday, June 3, 2009

Bing is out - Google, watch out




Microsoft's answer to Google is out. The all-new Windows search engine launched on June 1st. Quite attractive, Google-like menus, a customizable background and adequate speed - that is what one feels when using Bing for the first time. Since the interface is Google-like, users get easily accustomed to it. The image search is beautifully designed. An image search returns a set of results that can be viewed in different zoom modes. On hovering the mouse over an image, it gets zoomed, and the zoomed view displays the file name, website information and image dimensions. There are also options to search for similar images and send feedback in this zoom mode. On clicking an image, the page loads faster than Google Images. And overall we get a Vista feel, with smooth transitions and fades between pages.
Microsoft has copied Google's model of generating revenue via searches. On searching any keyword, we find that ads related to the keyword appear to the right of the page, just as in Google.
One of the most attractive features is the video search. The search page displays thumbnails of the videos in the search results. On moving the mouse over a thumbnail, a preview of the video is displayed. A cool feature - finally we get the feeling that something is missing in Google search. But there is a very serious drawback in this feature. Within hours of the launch, many visitors found that if the word "porn" is searched in video search mode, a list of porn thumbnails gets displayed. You can watch a preview of each thumbnail with your mouse, and the thumbnail video plays even if the site it links to is blocked. So parents, be careful: if you see your children hooked on Bing, they may be using this nice feature to explore censored stuff!
Overall, Bing has turned out to be a quality product from a market leader like Microsoft. Finally they have achieved something worthy of all the hard work they have put in over the years.
So kudos to the Bing team at Microsoft.
Go to www.bing.com for your Bing experience and start binging!

Sunday, May 31, 2009

PSP Go photos leaked






The much-hyped next-generation Sony PSP has been leaked. Recently there were rumours that Sony was about to launch a UMD-less PSP under the name Go, and those rumours are now turning into reality. The leaked photos show a rounded PSP with no visible keys near the screen, a slide-out keypad and Bluetooth connectivity. It is also said that the PSP comes with 16GB of internal memory. The PSP Go will sell alongside the PSP 3000, so proud PSP owners needn't worry. But I think traditional PSP owners will find the sliding keypad less comfortable; the keypad design and key positioning are better suited to Nintendo Game Boy fans. So Nintendo had better watch out.

Photos from www.engadget.com

Thursday, May 21, 2009

Blu-ray about to become outdated

Researchers at Swinburne University of Technology in Australia have figured out a way to store almost 1.6TB of data on an optical disc - that is, you would be able to store almost 300 DVDs' worth of content on the new disc. Calling the method "five-dimensional optical recording," the technique employs nanometer-scale particles of gold.
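As a quick sanity check on that claim, here is a back-of-the-envelope sketch. It assumes 4.7GB single-layer DVDs; counting dual-layer discs instead would roughly halve the figure.

    /* Back-of-the-envelope: 1.6TB expressed in single-layer DVDs. */
    #include <stdio.h>

    int main(void)
    {
        double disc_gb = 1600.0;   /* reported capacity: ~1.6TB */
        double dvd_gb  = 4.7;      /* single-layer DVD */
        /* Prints ~340 -- the same few-hundred ballpark as the
           reported "almost 300 DVDs". */
        printf("~%.0f single-layer DVDs\n", disc_gb / dvd_gb);
        return 0;
    }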

Tuesday, May 19, 2009

Age of Computational search coming




18 May 2009 saw the launch of Wolfram Alpha, the online computational search engine.
Web users can submit customized questions to the service, and Wolfram Alpha will try to work out the answer on the fly. The chance that a healthy 35-year-old woman will contract heart disease in the next 10 years? One in 167. The temperature in Washington, D.C., during the July 1976 bicentennial? An average of 74 degrees.

For questions like these, Google and Wikipedia, perhaps the two best known online reference tools, would search through vast databases of existing Web pages hoping for a match.

Not so with Wolfram Alpha. "We're not using the things people have written down on the Web," said Stephen Wolfram, the project's creator and the founder of Wolfram Research Inc., which is based in Champaign, Ill. "We're trying to use the actual corpus of human knowledge to compute specific answers."
To do that, Wolfram and his team of human curators have equipped their system with a wide array of mathematical equations, as well as 10 terabytes of data from thousands of sources: scientific journals, encyclopedias, government repositories and any other source the company feels is credible. That generally doesn't include user-created websites.

How much data is 10 terabytes? Ask Wolfram Alpha: It'll tell you that's about half the text-based content held by the Library of Congress.
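You can also pose such questions programmatically. Below is a hedged sketch against Wolfram Alpha's HTTP query endpoint using libcurl; it assumes the v2 REST interface and a registered AppID (YOUR_APP_ID is a placeholder you would obtain from Wolfram Research).

    /* Minimal sketch of querying Wolfram Alpha over HTTP.
       Build with: cc query.c -lcurl */
    #include <stdio.h>
    #include <curl/curl.h>

    int main(void)
    {
        CURL *curl = curl_easy_init();
        if (!curl)
            return 1;

        /* URL-encode the question before splicing it into the query string. */
        char *q = curl_easy_escape(curl, "temperature in Washington DC July 1976", 0);
        char url[512];
        snprintf(url, sizeof(url),
                 "http://api.wolframalpha.com/v2/query?appid=YOUR_APP_ID"
                 "&input=%s&format=plaintext", q);
        curl_free(q);

        curl_easy_setopt(curl, CURLOPT_URL, url);
        /* libcurl's default write callback prints the XML response
           (a list of answer "pods") to stdout. */
        CURLcode rc = curl_easy_perform(curl);
        if (rc != CURLE_OK)
            fprintf(stderr, "request failed: %s\n", curl_easy_strerror(rc));
        curl_easy_cleanup(curl);
        return 0;
    }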

And there's more to come.

Adding more data and computational capability is an endless process, Wolfram said. "The main thing we have to do is to work with experts in every possible domain."

Whether all that specific knowledge will translate into advertising dollars remains to be seen. Some analysts are skeptical about the site's potential to become a Google-like thoroughfare for online consumption.

Most search revenue comes from people doing commerce-related searches, said Douglas Clinton, an analyst at investment firm Piper Jaffray Cos. "You're not going to want an answer from Wolfram Alpha's computer about what the best digital camera is, because there's not really an algorithmic answer to a question like that."

As a much-hyped entrant into the knowledge search market, Wolfram Alpha has not escaped comparisons to Google and speculation about whether it could steal some of the search giant's massive market share.

But their mission statements make it clear that the two services are not identical.

Google famously hopes to "organize the world's information and make it universally accessible and useful."

The focus of Wolfram Alpha, on the other hand, is to "make it possible to compute whatever can be computed about anything."

Lofty hopes, but neither is there yet.

Wolfram Alpha can display the molecular structure of the solvent acetone. It can list recent earthquakes near most U.S. cities. And it can tell you the rate of inflation in Tanzania.

Yet it gets tripped up on a question as simple as "What time is it?"

As Wolfram himself points out, making the engine smarter is not just a matter of shoveling in more data. Even when the answer already exists in the database, the software may simply be unable to understand the question.

"What time is it in California," for instance, yields the correct result.

Half the battle, then, is teaching the program to parse human language so it knows what it's being asked to do.

But as rough as it may seem now, Wolfram Alpha looks to be the leading edge of a newer, smarter crop of search engines.

It's the use of so-called semantic technologies, where computers grapple with concepts and simple learning, that may define the next generation of Web services.

Does that mean artificial intelligence? Not quite yet, said James Hendler, a professor of computer science at New York's Rensselaer Polytechnic Institute.

"Computers are getting very good at the sort of powerful learning that comes from recognizing patterns in very large sets of data," he said. "But they still haven't gotten at all good at figuring out the very general, intuitive, complex things that make us human."

Source: LA Times

Tuesday, May 12, 2009

New H.264 Codec to make HD the de facto display standard



We've already got HD in places that the cast of Step by Step would've sworn was never possible way back when, but eASIC is far from satisfied. To that end, it's introducing a new H.264 codec aimed to bring high-def capabilities to all manner of devices, including (but certainly not limited to) toys, baby monitors, public transportation, wireless video surveillance and wireless webcams. The highly integrated eDV9200 is said to "dramatically lower the cost of entry into the high-definition video market, enabling a new class of low-cost applications to fully leverage the benefits offered by HD technology." Best of all, these guys aren't just blowing smoke, as the chip -- which captures streaming data directly from a CMOS sensor, compresses it, and transfers it to a host system or to a variety of storage devices -- is priced at just $4.99 each in volume.

Source: www.engadget.com

Image Spam on the rise

Image-based spam is on the rise once again after hitting near-extinction late last year. Ralf Iffert and Holly Stewart of IBM's X-Force team detailed the phenomenon in a blog post last week, noting that the techniques used in the latest waves of image spam aren't any different than that seen during its height in 2006 and 2007, and that most spam blockers should be able to catch them. Still, the new rise in this old spamming practice indicates that spammers are once again pulling out all the stops to drum up business.

According to the two researchers, image spam saw its heyday in 2006 and 2007 when it got as high as almost 45 percent of all spam. It began to tank, however, in the second half of 2007, with 2008 practically putting a nail in the image spam coffin. The spamming method had dropped to only five percent of all spam in October of 2008 before the notorious McColo shutdown, subsequently taking image spam down to less than one percent of all spam in November.

This was apparently the calm before the storm, however. Image-based spam first hit five to 10 percent of all spam in March of 2009 before skyrocketing to 15-22 percent in April. Iffert and Stewart note that—unlike a couple of years ago—it seems to focus more on drugs and pharmaceuticals instead of stock trading (most likely due to the current financial crisis). Most don't contain any clickable Web links and rely on the user to manually type in a URL into the browser.

Why in the world would spammers decide to resurrect this technique, especially when it requires so much interaction on the user's end? "Perhaps they are trying to mask their URLs through these images," wrote the researchers. "In their trial run near the end of March, did they see that some anti-spam systems were losing their edge when it came to image spam? We don’t think so. Are they simply running out of new ideas and rehashing old techniques? Maybe."

Because of the mysterious resurgence of image spam, Iffert and Stewart wonder whether other old 'n' busted spamming techniques will be coming back next, like PDF or MP3 spam. "Have we somehow hit a plateau of spam techniques? Who knows?" they wrote. "We can tell you that from the monitoring perspective, it all feels a bit strange. It's like sitting down to watch the storyline progress in your favorite TV show only to find that the directors have inexplicably substituted an 80's-style montage in its place."

Source: ComputerWorld

Monday, May 11, 2009

Google granted floating data center patent





The future of data centers appears to be a move from the land to the sea, with power coming from the movement of the water and cooling coming directly from the ocean. Google was granted a patent for a floating data center this week, allowing it to license out the technology to third parties if it should so choose.

Google's application for a "Water-based data center" patent was filed in February of 2007 and published late last year. It describes "a floating platform-mounted computer data center comprising a plurality of computing units, a sea-based electrical generator in electrical connection with the plurality of computing units, and one or more sea-water cooling units for providing cooling to the plurality of computing units."

The majority of the patent deals with the logistics of ship-based data centers, though it also examines the use of wave power, tidal power, and seawater for providing electricity and cooling to land-based data centers that are close enough to water.

Of course, there's nothing to stop Google from deploying a floating data center powered by conventional fuel sources, but such a vessel would be more limited by range or fuel capacity. Not only would it have to carry enough fuel to power itself, it would also have to make sure to power the systems it carries. Using a water-based generator would not only be more practical and efficient, it's also a significantly greener solution.

Despite the patent, however, Google may not be the first company to send its data centers out to sea. A Silicon Valley startup called International Data Security (IDS) announced in January of 2008 its intent to set up a fleet of data-serving cargo ships. These ships would not only come with standard storage services, but also with amenities such as private offices, overnight accommodations, and galley services. The first ship was scheduled to set sail (or rather, hang out in San Francisco's Pier 50) in April of 2008, but according to a blog post by IDS partner Silverback Migration Solutions, that plan got pushed to third quarter 2008 and we were unable to find any further information on the project.

Silverback acknowledged Google's patent in September, however, noting that IDS and Google appear to be planning different implementations of the floating data center. If that's the case, then it's likely that the two won't be stepping on each other's toes. However, if other companies decide to implement floating solutions similar to Google's in the future, they may find themselves having to pay licensing fees. Given the current economic climate, though, let's just say we don't expect to see a mass data center exodus to the ocean anytime in the near future.

Author: Jacqui Cheng
Source: www.arstechnica.com

Sunday, May 10, 2009

Mini HDMI



Mini HDMI connectors are on their way. Prototypes have come out based on the latest Type D spec, mainly targeted at cellphones, GPS systems, and other portable devices. The connector is only about half the size of a regular HDMI connector and almost the same size as a micro USB connector, yet it employs the same 19 pins as the standard HDMI design.

Tuesday, April 21, 2009

TATA Indica to be Launched in Norway

One of the biggest car manufacturers in India, TATA is about to launch its famed Indica car in Norway. But unlike the cars in India, this will be an electric car. The model is supposedly named the Indica EV. After surprising the whole world by launching its stylish one-lakh NANO car, TATA is about to make its presence felt globally.

Wednesday, January 14, 2009

History's Worst Software Bugs

Last month automaker Toyota announced a recall of 160,000 of its Prius hybrid vehicles following reports of vehicle warning lights illuminating for no reason, and cars' gasoline engines stalling unexpectedly. But unlike the large-scale auto recalls of years past, the root of the Prius issue wasn't a hardware problem -- it was a programming error in the smart car's embedded code. The Prius had a software bug.

With that recall, the Prius joined the ranks of the buggy computer -- a club that began in 1945 when engineers found a moth in Panel F, Relay #70 of the Harvard Mark II system. The computer was running a test of its multiplier and adder when the engineers noticed something was wrong. The moth was trapped, removed and taped into the computer's logbook with the words: "first actual case of a bug being found."

Sixty-four years later, computer bugs are still with us, and show no sign of going extinct. As the line between software and hardware blurs, coding errors are increasingly playing tricks on our daily lives. Bugs don't just inhabit our operating systems and applications -- today they lurk within our cell phones and our pacemakers, our power plants and medical equipment. And now, in our cars.

But which are the worst?

It's all too easy to come up with a list of bugs that have wreaked havoc. It's harder to rate their severity. Which is worse -- a security vulnerability that's exploited by a computer worm to shut down the internet for a few days or a typo that triggers a day-long crash of the nation's phone system? The answer depends on whether you want to make a phone call or check your e-mail.

Many people believe the worst bugs are those that cause fatalities. To be sure, there haven't been many, but cases like the Therac-25 are widely seen as warnings against the widespread deployment of software in safety critical applications. Experts who study such systems, though, warn that even though the software might kill a few people, focusing on these fatalities risks inhibiting the migration of technology into areas where smarter processing is sorely needed. In the end, they say, the lack of software might kill more people than the inevitable bugs.

What seems certain is that bugs are here to stay. Here, in chronological order, is the Wired News list of the 10 worst software bugs of all time … so far.

July 28, 1962 -- Mariner I space probe. A bug in the flight software for the Mariner 1 causes the rocket to divert from its intended path on launch. Mission control destroys the rocket over the Atlantic Ocean. The investigation into the accident discovers that a formula written on paper in pencil was improperly transcribed into computer code, causing the computer to miscalculate the rocket's trajectory.

1982 -- Soviet gas pipeline. Operatives working for the Central Intelligence Agency allegedly plant a bug in a Canadian computer system purchased to control the trans-Siberian gas pipeline. The Soviets had obtained the system as part of a wide-ranging effort to covertly purchase or steal sensitive U.S. technology. The CIA reportedly found out about the program and decided to make it backfire with equipment that would pass Soviet inspection and then fail once in operation. The resulting event is reportedly the largest non-nuclear explosion in the planet's history.

1985-1987 -- Therac-25 medical accelerator. A radiation therapy device malfunctions and delivers lethal radiation doses at several medical facilities. Based upon a previous design, the Therac-25 was an "improved" therapy system that could deliver two different kinds of radiation: either a low-power electron beam (beta particles) or X-rays. The Therac-25's X-rays were generated by smashing high-power electrons into a metal target positioned between the electron gun and the patient. A second "improvement" was the replacement of the older Therac-20's electromechanical safety interlocks with software control, a decision made because software was perceived to be more reliable.

What engineers didn't know was that both the 20 and the 25 were built upon an operating system that had been kludged together by a programmer with no formal training. Because of a subtle bug called a "race condition," a quick-fingered typist could accidentally configure the Therac-25 so the electron beam would fire in high-power mode but with the metal X-ray target out of position. At least five patients die; others are seriously injured.
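For readers unfamiliar with the term, here is a minimal C demonstration of a race condition in the generic sense -- not the Therac-25's actual code. Two threads update shared state without synchronization, and updates are silently lost because "read, add, write" is not atomic.

    /* Compile with: cc race.c -pthread */
    #include <pthread.h>
    #include <stdio.h>

    #define ITERATIONS 1000000

    static long counter = 0;            /* shared state, unprotected */

    static void *increment(void *arg)
    {
        (void)arg;
        for (long i = 0; i < ITERATIONS; i++)
            counter++;                  /* load, add, store: threads interleave here */
        return NULL;
    }

    int main(void)
    {
        pthread_t t1, t2;
        pthread_create(&t1, NULL, increment, NULL);
        pthread_create(&t2, NULL, increment, NULL);
        pthread_join(t1, NULL);
        pthread_join(t2, NULL);
        /* Expected 2000000; usually prints less because increments were lost. */
        printf("counter = %ld\n", counter);
        return 0;
    }

Whether a given run loses updates depends on exactly how the threads interleave, which is what makes this class of bug so hard to reproduce and test for.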

1988 -- Buffer overflow in Berkeley Unix finger daemon. The first internet worm (the so-called Morris Worm) infects between 2,000 and 6,000 computers in less than a day by taking advantage of a buffer overflow. The specific code is a function in the standard input/output library routine called gets() designed to get a line of text over the network. Unfortunately, gets() has no provision to limit its input, and an overly large input allows the worm to take over any machine to which it can connect.

Programmers respond by attempting to stamp out the gets() function in working code, but they refuse to remove it from the C programming language's standard input/output library, where it remains to this day.
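The flaw is easy to see in miniature. gets() has no way to know how large its destination buffer is, so input longer than the buffer overwrites adjacent memory (classically, the stack's return address); fgets() takes the size explicitly. A sketch, not the finger daemon's code:

    #include <stdio.h>

    int main(void)
    {
        char buf[16];

        /* Unsafe: any input longer than 15 characters overflows buf.
           gets(buf); */

        /* Bounded alternative: reads at most sizeof(buf) - 1 characters. */
        if (fgets(buf, sizeof(buf), stdin) != NULL)
            printf("read: %s", buf);
        return 0;
    }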

1988-1996 -- Kerberos Random Number Generator. The authors of the Kerberos security system neglect to properly "seed" the program's random number generator with a truly random seed. As a result, for eight years it is possible to trivially break into any computer that relies on Kerberos for authentication. It is unknown if this bug was ever actually exploited.
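The class of mistake is easy to reproduce with C's toy generator. Kerberos used its own RNG, so this is only an illustration: without an unpredictable seed, every run emits the same "random" sequence, and anything derived from it is guessable.

    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    int main(void)
    {
        /* Bug: no srand() call -- rand() starts from the same default
           seed every run, so these three numbers never change. */
        printf("unseeded: %d %d %d\n", rand(), rand(), rand());

        /* Better -- though time() is still guessable; real systems need
           a cryptographic entropy source: */
        srand((unsigned)time(NULL));
        printf("seeded:   %d %d %d\n", rand(), rand(), rand());
        return 0;
    }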

January 15, 1990 -- AT&T Network Outage. A bug in a new release of the software that controls AT&T's #4ESS long distance switches causes these mammoth computers to crash when they receive a specific message from one of their neighboring machines -- a message that the neighbors send out when they recover from a crash.

One day a switch in New York crashes and reboots, causing its neighboring switches to crash, then their neighbors' neighbors, and so on. Soon, 114 switches are crashing and rebooting every six seconds, leaving an estimated 60 thousand people without long distance service for nine hours. The fix: engineers load the previous software release.
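A toy simulation of that ripple effect -- purely illustrative, since the real #4ESS software was vastly more complex: one failed switch's recovery announcements knock over its ring neighbors, whose announcements knock over theirs.

    #include <stdio.h>
    #include <stdbool.h>

    #define SWITCHES 8          /* a small ring of switches */

    int main(void)
    {
        bool crashed[SWITCHES] = { false };
        crashed[0] = true;      /* one switch fails and reboots... */
        int newly = 1;

        for (int round = 1; newly > 0; round++) {
            bool next[SWITCHES];
            for (int i = 0; i < SWITCHES; i++)
                next[i] = crashed[i];
            newly = 0;
            for (int i = 0; i < SWITCHES; i++) {
                if (!crashed[i])
                    continue;
                /* ...and its recovery message crashes any neighbor
                   that receives it, per the buggy handler. */
                int left  = (i + SWITCHES - 1) % SWITCHES;
                int right = (i + 1) % SWITCHES;
                if (!next[left])  { next[left]  = true; newly++; }
                if (!next[right]) { next[right] = true; newly++; }
            }
            for (int i = 0; i < SWITCHES; i++)
                crashed[i] = next[i];
            if (newly > 0)
                printf("round %d: %d more switches down\n", round, newly);
        }
        return 0;
    }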

1993 -- Intel Pentium floating point divide. A silicon error causes Intel's highly promoted Pentium chip to make mistakes when dividing floating-point numbers that occur within a specific range. For example, dividing 4195835.0/3145727.0 yields 1.33374 instead of 1.33382, an error of 0.006 percent. Although the bug affects few users, it becomes a public relations nightmare. With an estimated 3 million to 5 million defective chips in circulation, at first Intel only offers to replace Pentium chips for consumers who can prove that they need high accuracy; eventually the company relents and agrees to replace the chips for anyone who complains. The bug ultimately costs Intel $475 million.
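The infamous division is a one-liner to try. On any correct FPU (i.e., any modern CPU) it prints the right answer; a flawed Pentium returned roughly 1.333739.

    #include <stdio.h>

    int main(void)
    {
        /* A correct FPU prints ~1.333820; the buggy Pentium gave
           ~1.333739, the 0.006 percent error described above. */
        double result = 4195835.0 / 3145727.0;
        printf("4195835.0 / 3145727.0 = %.6f\n", result);
        return 0;
    }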

1995/1996 -- The Ping of Death. A lack of sanity checks and error handling in the IP fragmentation reassembly code makes it possible to crash a wide variety of operating systems by sending a malformed "ping" packet from anywhere on the internet. Most obviously affected are computers running Windows, which lock up and display the so-called "blue screen of death" when they receive these packets. But the attack also affects many Macintosh and Unix systems as well.
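The underlying arithmetic is simple: IP's 16-bit total-length field caps a packet at 65,535 bytes, but a crafted final fragment can claim an offset and size whose sum exceeds that, overrunning a naive reassembly buffer. The numbers below are illustrative, not a recipe.

    #include <stdio.h>
    #include <stdint.h>

    int main(void)
    {
        uint32_t frag_offset = 65528;   /* last fragment's offset, bytes */
        uint32_t frag_size   = 40;      /* its claimed payload size */
        uint32_t total = frag_offset + frag_size;   /* 65568 > 65535 */
        printf("reassembled size %u exceeds the IP maximum %u by %u bytes\n",
               total, 65535u, total - 65535u);
        return 0;
    }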

June 4, 1996 -- Ariane 5 Flight 501. Working code for the Ariane 4 rocket is reused in the Ariane 5, but the Ariane 5's faster engines trigger a bug in an arithmetic routine inside the rocket's flight computer. The error is in the code that converts a 64-bit floating-point number to a 16-bit signed integer. The faster engines cause the 64-bit numbers to be larger in the Ariane 5 than in the Ariane 4, triggering an overflow condition that results in the flight computer crashing.
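In miniature, the failure is an unguarded narrowing conversion. The original code was Ada, where the overflow raised an unhandled exception; this C sketch, with a hypothetical sensor value, shows the range check that was effectively missing:

    #include <stdio.h>
    #include <stdint.h>

    int main(void)
    {
        /* Hypothetical value: within range on Ariane 4, out of range on
           the faster Ariane 5. */
        double horizontal_bias = 50000.0;

        /* The omitted guard: without it, converting a value that does
           not fit into 16 bits is where everything goes wrong. */
        if (horizontal_bias > INT16_MAX || horizontal_bias < INT16_MIN) {
            printf("%.1f does not fit in 16 bits -- overflow, the Ariane 5 case\n",
                   horizontal_bias);
        } else {
            int16_t converted = (int16_t)horizontal_bias;
            printf("converted safely: %d\n", converted);
        }
        return 0;
    }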

First, Flight 501's backup computer crashes, followed 0.05 seconds later by a crash of the primary computer. As a result of these crashed computers, the rocket's primary processor overpowers the rocket's engines and causes the rocket to disintegrate 40 seconds after launch.

November 2000 -- National Cancer Institute, Panama City. In a series of accidents, therapy planning software created by Multidata Systems International, a U.S. firm, miscalculates the proper dosage of radiation for patients undergoing radiation therapy.

Multidata's software allows a radiation therapist to draw on a computer screen the placement of metal shields called "blocks" designed to protect healthy tissue from the radiation. But the software will only allow technicians to use four shielding blocks, and the Panamanian doctors wish to use five.

The doctors discover that they can trick the software by drawing all five blocks as a single large block with a hole in the middle. What the doctors don't realize is that the Multidata software gives different answers in this configuration depending on how the hole is drawn: draw it in one direction and the correct dose is calculated, draw in another direction and the software recommends twice the necessary exposure.
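Multidata's algorithm has never been published, so purely as an analogy, here is one way a computation can depend on drawing direction: the shoelace formula assigns a polygon a signed area whose sign flips with the winding order, so a program that naively sums raw signed areas of a block and its hole would compute different "shielded" areas depending on how the hole was traced.

    #include <stdio.h>

    /* Signed area via the shoelace formula: positive for a polygon
       traced counterclockwise, negative for clockwise. */
    static double signed_area(const double x[], const double y[], int n)
    {
        double a = 0.0;
        for (int i = 0; i < n; i++) {
            int j = (i + 1) % n;
            a += x[i] * y[j] - x[j] * y[i];
        }
        return a / 2.0;
    }

    int main(void)
    {
        /* The same unit-square "hole", traced in opposite directions. */
        double hx_ccw[] = {0, 1, 1, 0}, hy_ccw[] = {0, 0, 1, 1};
        double hx_cw[]  = {0, 0, 1, 1}, hy_cw[]  = {0, 1, 1, 0};
        printf("hole drawn one way:   %+.1f\n", signed_area(hx_ccw, hy_ccw, 4));
        printf("hole drawn the other: %+.1f\n", signed_area(hx_cw, hy_cw, 4));
        return 0;
    }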

At least eight patients die, while another 20 receive overdoses likely to cause significant health problems. The physicians, who were legally required to double-check the computer's calculations by hand, are indicted for murder.