Monday 22 April 2013


We first heard rumors about a possible comeback of the Start button in Windows 8.1 last week, but now sources speaking to The Verge have confirmed that this will indeed be the case, only it’s probably not what most detractors were hoping for. The newly reintroduced button will reportedly sit in the traditional bottom-left corner and will look near-identical to the existing Windows flag used in the Charms bar, but clicking on it will simply bring up the tile-based Start screen rather than the old Start menu.
There are already several quick ways to get back to the Start screen from the desktop. Users can just press the Windows key on their keyboard, or hover their mouse over the lower-left corner of the screen until a Start screen thumbnail appears, then click it. So while there’s nothing new here functionality-wise, Microsoft apparently hopes to appease at least some critics by adding a shortcut users are more familiar with.
To be fair, you can already do everything the Start menu allowed with the redesigned Start screen -- searching, opening recent files, quickly launching apps, jumping to the Control Panel and so on. But those who have been criticizing the change take issue with having to jump back and forth between the Modern UI and the desktop to do these things.
Another noteworthy change expected to arrive with the upcoming “Blue” update is the addition of a boot-to-desktop option. So far only hints of this have appeared in internal builds, and there’s currently no toggle to enable it through the operating system’s UI, but Microsoft is apparently working on how to add this feature -- Windows Weekly’s Paul Thurrott believes it might be limited to the Pro and Enterprise Windows 8 SKUs.

Friday 12 April 2013


An 18-month study has concluded that searches on Bing are five times more likely to link users to malicious websites than searches through Google. The study, conducted by German independent testing firm AV-Test, noted that although strides have been made to curb malware-infested websites, they still manage to appear among the top search results for a given query.
Data for the study was collected from close to 40 million websites spanning seven different search engines. Roughly 25 percent of the results came from Bing while Google accounted for another 25 percent, we’re told. Some 13 million results were supplied via Russian search engine Yandex with the remainder provided by Baidu, Blekko, Faroo and Teoma.
The results weren’t terribly alarming, as only 5,000 instances of malware were discovered across the 40 million sites – a rather small percentage. Of those 5,000, Google delivered the fewest, followed by Bing. Yandex had a comparatively large number of malware hits, with some 3,300 malicious links found among its 13 million search results.
In total, Google returned only 272 malicious results out of 10 million searches while Bing returned 1,285 dangerous links across the same number of searches. Blekko returned just 203 malicious results, albeit out of nearly three million results.
PCMag estimated that the odds of finding a malware-laden website via Google search are about one in 40,118. Keep in mind, however, that those odds play out billions of times each day, so it’s inevitable that some dangerous results are returned to unsuspecting users.
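For a back-of-the-envelope sense of those numbers, here is a rough sketch using the per-engine figures above (not PCMag’s exact methodology, which presumably draws on the full 18-month dataset):

```python
# Rough odds of hitting a malicious result, from the per-engine figures above.
# PCMag's published 1-in-40,118 estimate likely reflects the full 18-month
# dataset, so this snapshot lands in the same ballpark rather than matching it.
results_checked = 10_000_000

for engine, malicious in {"Google": 272, "Bing": 1285}.items():
    print(f"{engine}: 1 in {results_checked / malicious:,.0f} results was malicious")

# Google: 1 in 36,765 results was malicious
# Bing: 1 in 7,782 results was malicious
```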

Solid state drives could soon become a staple in the enterprise sector thanks to a recent commitment from IBM. Big Blue announced plans yesterday to invest $1 billion in the research and development of flash storage to help design and build servers, storage systems and middleware for better flash storage integration.
I’ve been preaching the benefits of flash storage in desktop and notebook systems for years as the single best upgrade you can perform to boost overall performance while reducing power consumption, noise and heat. Many of these same attributes stand to benefit IBM as data use from smartphone and tablet users continues to rise.
Solid state technology is able to process data much faster and more reliably than traditional hard drives with mechanical internals. IBM systems and technology group general manager Ambuj Goyal said the economics and performance of flash storage are now at a point where the technology can have a revolutionary impact on enterprise, especially with transaction-intensive applications.
In addition to the investment, IBM also launched a new FlashSystem line built with businesses in mind. These systems, based on technology from Texas Memory Systems, use flash storage exclusively with capacities reaching up to 24 terabytes.
Clients running business analytics applications will realize energy reductions of up to 85 percent, while systems installed in cloud-based data centers will use up to 80 percent less energy. We’re told that Sprint Nextel is one of the first companies to sign up, with plans to install nine flash-based systems at its data center.

Nvidia’s recent investor day was the perfect platform for CEO Jen-Hsun Huang to showcase the firm’s next generation mobile chip. Known as Kepler Mobile, the upcoming mobile hardware is derived from the high-end Kepler architecture that Nvidia uses for current-generation notebook and desktop GPUs.
Huang said Nvidia made a huge investment to port the PC hardware to mobile which required them to shrink the size of the chip and reduce the power consumption from dozens of watts to hundreds of milliwatts. Energy consumption and the subsequent cooling requirements that come with it have been one of the key reasons why mobile graphics have lagged behind their PC counterparts.
Huang said Nvidia wants to get multiple years ahead of the competition, making a strategic decision to delay other projects in order to develop Kepler Mobile at a faster rate. The chip will be able to play high-end PC titles and may even be capable of running DirectX 11 -- a technology that would deliver advanced shadows and lighting, among other eye candy.
A video shown to attendees demonstrated what the latest iPad, powered by the A6X, is capable of. Nvidia made sure to point out that 40 percent of the A6X’s silicon consists of GPU cores. Next, onlookers were shown a demonstration of Battlefield 3 running on the company’s new mobile platform. The latter featured technology like HDR lighting and particle effects, while the iPad’s graphics looked “vintage 1999”, Nvidia teased.
No word yet on when we can expect to see Kepler Mobile graphics show up, however.

Apple has agreed to pay $53 million to settle a class action accusing the company of denying repairs for some of its mobile devices while they were still under warranty. The payout covers the original iPhone, the iPhone 3G, and the 3GS, along with the first three generations of the iPod touch.
According to several lawsuits consolidated in San Francisco, the Cupertino-based firm had denied repairs because the liquid contact indicator tape found in the headphone jack or charging port of devices had been activated -- turning pink or red. Apple’s warranty policy explicitly excludes repairs or exchanges over water damage. But it turns out the indicator isn’t 100% accurate: 3M, the tape’s manufacturer, has since indicated that humidity alone could trigger the mechanism in some cases.

Water damage indicator in iPhone 3G / 3GS
As is often the case in these settlements, Apple will admit no wrongdoing but is willing to compensate customers who feel they’ve been wronged. Customers will receive monetary compensation based on the average replacement cost of their particular device at the time, ranging from $160 for the 8GB original iPod touch all the way up to $300 for the 16GB iPhone. Of course, payouts are proportional to the number of claims filed, so the more people who join in, the less each affected user will receive.
Under the agreement, the company will take out an ad in both USA Today and Macworld to provide the website and contact information needed to file a claim, while potentially affected customers will be contacted directly by mail as well. The settlement document, obtained by Wired, is expected to be filed in a San Francisco federal court and still needs to be approved before it goes into effect.

Verizon recently launched a new prepaid plan designed to offer significant savings for those not interested in a smartphone. The no-contract plan includes 500 anytime talk minutes in addition to unlimited text and Internet usage for just $35 per month – a solid value for someone that doesn’t spend a lot of time on their mobile device but would still like to keep it around for the occasional call or emergency.
As of writing, Verizon is offering four feature phones to use with the new plan: the LG Cosmos 2, the LG Extravert, the Samsung Gusto 2 and the Samsung Intensity 3. Each of these handsets can be had at a deep discount when activating as part of the new plan. Furthermore, a Verizon spokesperson said customers that already own select basic feature phones may be able to use those with the $35 plan as well.
There are a few stipulations to be aware of before you sign up, however. The plan doesn’t include free mobile-to-mobile calling, which means that if you call another Verizon customer, it’ll count against your monthly bundle of minutes. It’s also worth pointing out that users will be charged $0.25 per minute beyond the allotted 500-minute bundle.
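For illustration, the plan’s pricing logic boils down to a few lines. The $35 base, 500-minute bundle and $0.25 overage rate come from the plan described above; the function itself is just a sketch:

```python
def monthly_bill(minutes_used: int) -> float:
    """Estimated bill on the $35 prepaid plan described above (talk only;
    text and web are unlimited, and mobile-to-mobile counts toward the bundle)."""
    base, included, overage_rate = 35.00, 500, 0.25
    overage_minutes = max(0, minutes_used - included)
    return base + overage_minutes * overage_rate

print(monthly_bill(480))  # 35.0 -- within the 500-minute bundle
print(monthly_bill(560))  # 50.0 -- 60 extra minutes at $0.25 each
```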
In the event that 500 minutes per month simply isn’t enough, you may want to look into another recently announced prepaid plan from Verizon that offers unlimited talk, text and data for $50 per month. Alternately, no-contract plans sold through RadioShack start at $25 and include 300 monthly minutes.

Sales of personal computers plummeted 14 percent in the first quarter of 2013 – the worst drop in nearly 20 years according to a new report from International Data Corp. The findings are a bit breathtaking as even the firm that produced the data forecasted a drop of just 7.7 percent.
The industry as a whole sold just 76.3 million PCs during the first three months of this year. For comparison, nearly 353 million PCs were sold in 2011 – a figure that we won’t even get close to this year.
The PC industry has been suffering for quite some time thanks to the rising popularity of tablets and smartphones. No company – large or small – is impervious to the effects as even big time players like HP have been struggling to restructure their business accordingly.
Roger L. Kay, founder and president of Endpoint Technologies Associates, said the one message here is to go faster – referring to companies trying to transition to tablets and smartphones. He noted that some companies will only get halfway through the transition process and that some will ultimately go bankrupt.
It all started around 2011, when the industry grew just two percent. Many hoped Windows 8 would revive slumping sales, but that never happened. It’s been downhill ever since.
If there’s a shining light in it all, it’s for the consumer. Pricing is expected to drop as much as 30 percent over the next few months as manufacturers want to unload inventory before pushing new products starting in June.

There’s been no shortage of Xbox rumors as we move closer to the console’s unveiling next month -- well, technically, that’s also a rumor. Although I was hoping to steer clear of further hearsay until official details were shared, I couldn’t help but bring to your attention an updated Xbox roadmap posted by VGLeaks with new details about the hardware lineup and a clarification about the whole “always connected” madness.
If you recall, VGLeaks was previously responsible for credible leaks revealing Durango’s specs as well as a hardware overview describing an “Always On, Always Connected” design.
Never mind that the latter was mentioned in the context of being able to download updates in the background; most sites still reported it as proof or confirmation that Microsoft would be implementing online checks to block used games, even though there was no mention of this in the documents or by VGLeaks. To be fair, reputable sites like Kotaku and Edge were told as much by their own separate sources, but we remained unconvinced.

Always on, always online is not what you think

Now VGLeaks is clarifying that Durango will indeed be always online “like any other device”, but being online will not be a requirement to play local content, nor will it prevent playing used games. In other words, the feature is meant for downloading things like game or social network updates in the background when a network connection is available; you will still be able to play Xbox games if your broadband is down or you take your console elsewhere.
That’s more in line with what we’ve been saying all along although it’s worth noting it’s all still unconfirmed.
The site notes that the “always online” rumors likely stemmed from development kits with components requiring network connections to be present all the time, which could explain what some inside sources were saying.

Two-SKU strategy: The Xbox Mini

In addition to making this minor but important clarification, the updated roadmap also points to a separate console with a more limited feature set known as the “Xbox Mini” -- we’ve heard about this before, though it was referred to as “Xbox TV”. In a nutshell, this is a repackaged and reoriented Xbox 360 designed to access the platform’s entertainment apps (think Apple TV competitor) and play games downloaded from Xbox Live.
Microsoft is aiming at a $150 price tag for this smaller Xbox unit and will possibly design it to be stackable atop the full-fledged “Durango” console. It will most likely lack an optical drive but it can be networked with its bigger brother to provide backwards compatibility for 360 games. The next-gen Xbox will not support older games on its own as it’s based on a different architecture than its predecessor.

Taking over your TV signal

A separate report from The Verge also claims Microsoft will introduce a feature that lets its next-generation console take a cable box signal and pass it through to the TV via HDMI, allowing it to overlay a UI and features on top of an existing TV channel or set-top box. Microsoft is reportedly seeking partnerships with content providers for this. Apparently the functionality will be tied to the full-fledged Xbox rather than the Mini version.

A security consultant by the name of Hugo Teso claims he has created an Android app called PlaneSploit that would allow him to remotely attack and hijack commercial aircraft. He recently presented his findings at the Hack in the Box security conference in Amsterdam where, among other things, he exposed the fact that a number of aviation and aircraft systems have no security in place.
Teso, a trained commercial pilot of 12 years, explained that the Automatic Dependent Surveillance-Broadcast (ADS-B) system is unencrypted and unauthenticated, which can lead to passive attacks like eavesdropping or active attacks such as message jamming and injection. Furthermore, the Aircraft Communications Addressing and Reporting System (ACARS) – a service used to send text-based messages between aircraft and ground stations – also has no security.
With these vulnerabilities in mind, he used virtual planes in a lab to demonstrate his ability to hijack a plane rather than attempting to take over a real flight as that was “too dangerous and unethical.” He used ACARS to gain access to the plane’s onboard computer system and uploaded Flight Management System data.
Once in, he demonstrated how it was possible to manipulate the steering of a Boeing jet while it was in autopilot mode. The security consultant said he could cause a crash by setting the aircraft on a collision course with another jet or even give passengers a scare by dropping down the emergency oxygen masks without warning.
A pilot could thwart an attack by taking the plane out of autopilot although he pointed out that several newer systems no longer include manual controls. Some systems could be updated to patch the vulnerabilities but many legacy systems would be difficult, if not impossible, to update.


Sunday 7 April 2013



Speculation about a music streaming service from Apple has been going on for quite some time. The rumored addition would presumably complement the company’s download model on iTunes, but apparently one thing holding it back all this time is Apple’s stringent demands as well as labels’ unwillingness to play ball after feeling short-changed on previous dealings. That could change soon, according to CNET.
A new report cites “two people familiar with the negotiations” claiming Apple is very close to striking a streaming deal with Warner Music and Universal Music. The company would still have to sign a deal with Sony, but snagging two major music labels might be enough for its launch target of summer 2013.
The new streaming service would go up against the likes of Spotify and Pandora, although CNET’s sources say it most closely resembles the latter since Apple won’t offer on-demand listening. That said, it will include a few unique features compared to Pandora, such as the ability to jump back to the beginning of a song.
As far as fees are concerned, Apple is flexing its muscles as the dominant player in the music business. The report claims the company is negotiating a per-stream rate that’s half of what Pandora currently pays labels, with the promise of new revenue channels to make up the difference. This includes a quick way for consumers to buy a song they hear -- where Apple also takes a cut per sale -- and incorporating audio ads into the free service.
Apple and the labels are still hammering out what the revenue split for ads would be, but labels are reportedly pressing for 35 to 45 percent in order to agree to the lower per-stream revenue.
If everything falls into place, Apple’s annual Worldwide Developers Conference, expected to take place in June, could be the ideal stage to announce the new service within the rumored summer timeframe.


Microsoft could be working on a connected device similar to Google’s Project Glass that would debut sometime next year, according to a note to investors from analyst Brian White. What’s more, he believes Google has made significant breakthroughs in the software applications surrounding Glass, which will help usher in a new industry of wearable electronics.
The software giant filed for a patent for a wearable glasses device late last year although the device appeared to be more of an entertainment accessory to enhance the experience of sporting events and the like rather than a daily personal assistant of sorts like Google Glass.
Google Glass is by far the most anticipated wearable device of the moment, simply because we know it exists and is coming later this year. We don’t have an exact release date, but Google has said it will be ready by the end of the year for less than $1,500.
Another hot topic in the field of wearable electronics is Apple’s rumored wristwatch, dubbed the iWatch. As usual, Cupertino hasn’t officially announced anything on the subject, but most industry insiders believe the product is under development with an unveiling imminent. Samsung has confirmed it is working on a smartwatch as well.
While gadgets like Google Glass are a hot topic among tech enthusiasts, it remains to be seen whether or not the average consumer will have any interest in wearable electronics.

Update: Microsoft's Larry "Major Nelson" Hryb has issued an official statement on behalf of the company.
"We apologize for the inappropriate comments made by an employee on Twitter yesterday. This person is not a spokesperson for Microsoft, and his personal views do not reflect the customer centric approach we take to our products or how we would communicate directly with our loyal consumers.  We are very sorry if this offended anyone, however we have not made any announcements about our product roadmap, and have no further comment on this matter."
-- Original story below.
We’ve been hearing for some time now that Microsoft’s next generation game console could require an always-on Internet connection and that games must be installed to the hard drive. We were skeptical of such a requirement as recently as late last month, but based on a recent tweet by Microsoft Studios creative director Adam Orth, the rumors could very well be true.
Orth took to the microblogging platform to publicly proclaim that he doesn’t get the drama around having an always-on console. He points out that every device now is always-on and that it’s simply the world we live in. The executive wrapped up the post with #dealwithit.
As you can imagine, his post stirred up quite a bit of controversy in the Twittersphere, but Orth stuck to his guns. One Twitter member said he knew of five people who don’t have Internet access. Orth said those people should get with the times and get the Internet because it’s awesome.
When asked if he learned anything from Diablo III and SimCity (referring to their constant connection requirements) and that the Internet goes out occasionally, Orth shot back with a couple of analogies about electricity going out and poor mobile reception as reasons not to buy a vacuum cleaner or a mobile phone, respectively.
Microsoft has yet to publicly comment on the Xbox 720, which makes Orth’s tirade especially interesting. Last we heard, Redmond was preparing to unveil its next console at a special media event next month or perhaps at E3 in June.

When Ubisoft started teasing Far Cry 3: Blood Dragon earlier this week many just assumed it was an elaborate April Fools hoax. The website for the game, which is reminiscent of the old Geocities days with brightly colored animated gifs and a couple of visitor counter applets, depicted a dystopian future set in 2007 as envisioned sometime around the ‘80s complete with an overall B-movie feel. Well, apparently it was no joke.
Today a number of screenshots have trickled online courtesy of the official Xbox.com website. If that’s not enough proof, Eurogamer also points to several other recent leaks and hints, such as the fact that the game has already been rated by the Brazilian and Australian game classification boards, and a list of achievements that surfaced last week describing 400G worth of objectives.
The game’s narrative doesn’t appear to hold many parallels with the original Far Cry 3 besides the mention of a distant island. And the visual elements… well, it’s a dramatic departure that swaps the shanty towns for futuristic neon-lit laboratories. Judging by the game’s box art you can also expect cyborgs and laser-firing dinosaurs to make an appearance. Here’s the synopsis from the official website:
The year is 2007. It is the future. Earth has been ravaged by a nuclear war and new paths for peace must be found. A U.S. cyborg army may have found a solution: a powerful bioweapon on a distant island. A Mark IV Cyber Commando, Sergeant Rex Power Colt has been sent over to gather information and figure out what the hell is going on.
Awesome? Cheesy? Awesomely cheesy? We’ll have to wait a little longer to make final judgement on that. The aforementioned game ratings suggest Blood Dragon will be coming to PlayStation 3, Xbox 360, and PC, although it’s unclear if it will come in the form of a traditional expansion or as a standalone digital download.

Friday 5 April 2013


An internal government document has revealed that encryption used in Apple’s iMessage chat service has prevented Drug Enforcement Administration officials from spying on suspects’ conversations. The document, seen by CNET, cites a February 2013 criminal investigation in which officials said it is impossible to intercept iMessages between two Apple devices regardless of service provider, even with a court order.
The DEA just recently became aware of the service despite the fact that iMessage launched in October 2011. According to the report, the task force noticed that not all text messages were being captured from data supplied by Verizon Wireless. It soon became evident that the suspect was using iMessage to communicate with some associates.
Apple’s iMessage, which sends messages over the Internet instead of as a traditional SMS, is the most popular encrypted chat program in history. As of last fall, the service had sent more than 300 billion messages. The free service prompted a number of wireless carriers to adjust their text messaging plans to make up for lost revenue.
Apple didn’t specifically design iMessage to circumvent government surveillance, according to senior policy analyst Christopher Soghoian from the American Civil Liberties Union. He said the government would need to perform what’s described as an active man-in-the-middle attack to intercept data. Soghoian goes on to say that the real issue is why phone companies in 2013 are still delivering unencrypted audio and text services to their users. “It’s disgraceful,” as he puts it.

Apple’s proposed solar-powered “spaceship” campus is expected to cost $2 billion more to construct than originally estimated. The total projected cost of $5 billion would make the structure more expensive to build than the new World Trade Center complex in New York City ($3.9 billion), according to five people close to the project, as reported by Bloomberg.
If you recall, the late Steve Jobs personally announced plans to build the facility during a Cupertino City Council meeting in June 2011. In fact, it was his last public appearance on behalf of Apple before his death.
 
The co-founder was instrumental in designing the elaborate 176-acre complex which is expected to feature curved glass throughout with no seams, gaps or paintbrush strokes visible anywhere. What’s more, every wall, ceiling and floor is to be polished to a supernatural smoothness while the interior wood will consist of “heartwood” from the center of a specific type of maple tree.
It’s unclear whether or not all of these stringent design elements will stand as the architect behind the project, Foster + Partners, is reportedly trying to shave $1 billion from the budget. What we do know, however, is that the projected completion date has been moved back from 2015 to 2016 – validating rumors on the subject from late last year.

Apple could cancel the project altogether but that would likely be a huge mistake as A) they need the office space and B) it could be brand suicide. We’ll have to wait and see how investors react to these latest developments in the midst of declining stock value.
AMD is reportedly prepared to take its "Never Settle" promotion to the next step by extending it to other product lines. According to Hardware Luxx, shoppers who opt for one of the company's APUs will be rewarded with free game codes. It's said that by the second or third week of April, certain A8 and A10 APUs will be bundled with SimCity, though details are still muddy about exactly which chips will be eligible.
Graphics card manufacturers have long attempted to entice your purchase by including free games -- a particularly sweet arrangement if you had planned to buy the promotional title anyway. Granted, some folks will always argue that they'd rather just have cash savings over a "free" game, but the bundle deals tend to be popular among those eyeing a new Radeon or GeForce and could easily sway someone's decision.
As such, AMD ramped up its efforts late last year to couple its latest graphics card series with various big name new releases. The company's "Never Settle" initiative kicked off by giving high-end Radeon buyers 20% off Medal of Honor: Warfighter as well as a free copy of Far Cry 3, Hitman: Absolution and Sleeping Dogs. At minimum, the bundle offered a $60 value and that scaled up to $170 if you bought an HD 79xx card.
It seems the company's initial push was a success as it recently announced a new offer that added Crysis 3, BioShock Infinite and Tomb Raider to the fold. Those who buy two HD 7900 series cards or an HD 7990 get free copies of all three new additions along with the previous three titles, while one card from the HD 7900 range gets you Crysis 3 and BioShock, and one HD 7800 card gets you BioShock and Tomb Raider.

It's no secret that tablets and smartphones are gradually occupying more computing time among folks who don't need the horsepower of a full-fledged PC. Although this trend has prompted many to predict the death of PCs, it's fairer to say that high-end desktops aren't going anywhere, we're just witnessing a change of preferred platforms among standard users -- a shift that could leave Microsoft irrelevant in as little as four years, Gartner warns.
"While there will be some individuals who retain both a personal PC and a tablet, especially those who use either or both for work and play, most will be satisfied with the experience they get from a tablet as their main computing device," Gartner research VP Carolina Milanesi said. "As consumers shift their time away from their PC to tablets and smartphones, they will no longer see their PC as a device that they need to replace on a regular basis."
Considering Microsoft's dominance is primarily in PCs, declining desktop and laptop sales aren't good news for the software giant. However, the company's weak presence in the booming mobile segment presents a bigger concern. Microsoft has largely been one step behind its rivals when it comes to smartphones and tablets, and shipments of Windows-branded devices, including PCs and mobile hardware, will soon be dwarfed by others.
Unless something changes, Gartner predicts that shipments of Apple's devices (including iOS and Mac OS) will reach parity with Windows PC and mobile hardware -- something that reportedly hasn't happened since the 1980s. During the same period, shipments of Android devices will total about twice that of either Microsoft or Apple -- a gap that will continue to grow through 2017 when Google's OS is expected to ship on 1.5 billion devices.

"Winning in the tablet and phone space is critical for them to remain relevant in this shift," Milanesi told the Guardian. "We're talking about hardware displacement here -- but this shift also has wider implications for operating systems and apps... Android is going to get to volumes that are three times those of Windows. From a consumer perspective, the question becomes: what software do you want to have to get the widest reach on your devices?"
She also noted that many people in developing markets are more likely to have a smartphone or tablet as their first "computer," and they'll likely stick with devices that provide a smartphone-like experience. Microsoft could also face increasing trouble attracting application developers as Apple and Android greatly outpace the company's smartphone and tablet shipments, and few things will kill an outfit's mobile efforts faster than having a weak ecosystem.
It’s been roughly two months since Opera announced it was dropping its own Presto web rendering engine in favor of WebKit, the same engine used by Apple's Safari and Google's Chrome. The move was seen as an attempt to remain relevant in the mobile market, where Opera has usually been a strong contender, and it essentially reduced the market to three major engines, with Microsoft's Trident and Mozilla's Gecko being the other two.
Back then, Mozilla lamented the decision, noting that WebKit dominating the mobile scene would make its job of promoting web standards harder. Fast forward to today, and there are two new players jumping onto the scene.

Google Blink

The first comes courtesy of Google, which just announced it will be forking WebKit to create the Blink rendering engine. Google’s Adam Barth explained over at the Chromium blog that it wasn’t an easy decision to make, but with WebKit being an open source project that responds to the interests of different customers on varying platforms, it was becoming increasingly difficult to move development along apace while keeping it working for all.
Blink will still be open source, which means anyone working with WebKit can fold changes from Blink back into the main engine source. But of course the key difference here is that Google will be calling the shots going forward, so you can expect the code to be tweaked specifically for Chromium’s needs.

Eventually Blink will become the rendering engine of both the desktop and mobile versions of Google's Chrome browser, as well as the engine that drives its web-based Chrome OS operating system. Opera has also confirmed it will follow Google on this one, adopting both Chromium and Blink.
It’s unclear if this means Opera’s iOS clients will be discontinued, since Apple only allows WebKit-based browsers to run on its platforms, or if it will keep a WebKit version alongside the new Blink-based browser for some time. But with Google holding the largest share of the combined desktop and mobile browser market -- 38% according to StatCounter -- maybe it feels confident enough to part ways with Apple.
There’s still work to be done before that can happen, however. Barth says the bulk of the initial work will focus on internal architectural improvements and simplification of the codebase by dropping more than 4.5 million lines of code relating to other architectures. No timeframe for the change was given.

Mozilla and Samsung's Servo

Meanwhile, the Mozilla Foundation also announced it’s working on a new web browser engine called Servo, developed in collaboration with Samsung. The companies said Servo is an attempt to rebuild the Web browser from the ground up to take advantage of tomorrow’s faster, multi-core, heterogeneous computing architectures.
Security also appears to be a key focus for Mozilla, which touts a new “safe systems language” called Rust as the basis for the new engine, which will reportedly prevent “entire classes of memory management errors that lead to crashes and security vulnerabilities.” The language is said to fill many of the same niches as C++, with efficient high-level, multi-paradigm abstractions, and precise control over hardware resources.
Mozilla isn’t talking about replacing Gecko in either its desktop or mobile browsers just yet. But partnering with a mobile heavyweight could be a major strategic win for Firefox, which was late to the mobile game and could suddenly end up with an enormous user base on smartphones if Samsung were to swap out the default Chrome browser.

What does this mean for web developers?

Both new engines are still at an early stage -- more so Mozilla and Samsung’s Servo -- so in the short term this should bring little change for web developers. But in the end, the introduction of a new rendering engine means more work for developers, who need to code and test for incompatibilities on a whole other platform.
Blink is likely to gain wider adoption in the short term due to Chrome’s reach, but given that it’s a fork of WebKit, supporting it might not be too much of a hassle. As for Servo, the engine will still need to prove itself and garner a strong user base before developers make the effort to support it. Mozilla isn’t really a force in the mobile web space these days, but if the Samsung partnership pans out, it will add yet another variable to the mix.
It’s worth mentioning that Google has vowed to collaborate closely with other browser vendors to move the web forward and preserve the compatibility that made it a successful ecosystem. Others are a little more cynical about Google’s intentions with Blink -- check out this comical no-BS FAQ from prng.net.

3Dfx Voodoo: The Game-changer

Launched in November 1996, 3Dfx's Voodoo Graphics consisted of a 3D-only card that required a VGA pass-through cable from a separate 2D card to the Voodoo, which then connected to the display.
The cards were sold by a large number of companies. Orchid Technologies was first to market with the $299 Orchid Righteous 3D, a board noted for having mechanical relays that “clicked” when the chipset was in use. Later revisions utilized solid-state relays in line with the rest of the vendors. The card was followed by Diamond Multimedia’s Monster 3D, Colormaster’s Voodoo Mania, the Canopus Pure3D, Quantum3D, Miro Hiscore, Skywell (Magic3D), and the 2theMAX Fantasy FX Power 3D.
Voodoo Graphics revolutionized personal computer graphics nearly overnight and rendered many other designs obsolete, including a vast swathe of 2D-only graphics producers. The 3D landscape in 1996 favoured S3 with around 50% of the market. That was to change soon, however. It was estimated that 3Dfx accounted for 80-85% of the 3D accelerator market during the heyday of Voodoo’s reign.

Diamond Multimedia’s Monster 3D (3dfx Voodoo1 4MB PCI)
Around that time, VideoLogic had developed a tile-based deferred rendering (TBDR) technology that eliminated the need for large-scale Z-buffering (removing occluded/hidden pixels in the final render) by discarding all but the visible geometry before texturing, shading and lighting were applied to what remained. The frame was sectioned into rectangular tiles, each tile having its polygons rendered and sent to output. Polygon rendering commenced only once the pixels required for the frame were calculated and occluded polygons culled (Z-buffering occurred only at the tile level). This way only a bare minimum of calculation was required.
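A minimal, runnable sketch of the idea follows. This is hypothetical and greatly simplified, not VideoLogic’s actual pipeline: “polygons” here are flat axis-aligned rectangles, and the per-tile depth test is implicit in the min() call:

```python
# Toy sketch of tile-based deferred rendering (TBDR) as described above.
WIDTH, HEIGHT, TILE = 8, 8, 4  # tiny framebuffer and tile size

polys = [  # (x0, y0, x1, y1, depth, color) -- lower depth = closer to camera
    (0, 0, 8, 8, 5.0, "blue"),  # background quad
    (2, 2, 6, 6, 1.0, "red"),   # occluding quad in front
]

framebuffer = {}

for ty in range(0, HEIGHT, TILE):
    for tx in range(0, WIDTH, TILE):
        # 1. Bin: keep only polygons overlapping this tile.
        binned = [p for p in polys
                  if p[0] < tx + TILE and p[2] > tx and p[1] < ty + TILE and p[3] > ty]
        # 2. Resolve visibility per tile (on-chip in real hardware), instead
        #    of keeping a full-screen Z-buffer in slow external memory.
        for y in range(ty, ty + TILE):
            for x in range(tx, tx + TILE):
                hits = [p for p in binned if p[0] <= x < p[2] and p[1] <= y < p[3]]
                if hits:
                    # 3. "Shade" only the surviving surface -- occluded pixels
                    #    never consume texturing or lighting work at all.
                    framebuffer[(x, y)] = min(hits, key=lambda p: p[4])[5]

print(framebuffer[(3, 3)], framebuffer[(0, 0)])  # red blue
```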
The first two series of chips and cards were built by NEC, while Series 3 (Kyro) chips were fabricated by ST Micro. The first card was used exclusively in Compaq Presario PCs and was known as the Midas 3 (the Midas 1 and 2 were prototypes for an arcade based system project). The PCX1 and PCX2 followed as OEM parts.
Series 2 chip production initially went to Sega’s Dreamcast console, and by the time the desktop Neon 250 card hit retail in November 1999, it was brutally outclassed at its $169 price range, particularly in higher resolutions with 32-bit color.
Years before the Neon 250 arrived, Rendition’s Vérité V1000 had become the first card with a programmable core to render 2D + 3D graphics, utilizing a MIPS-based RISC processor alongside the pixel pipelines. The processor was responsible for triangle setup and for organizing workload for the pipelines.
Originally developed towards the end of 1995, the Vérité 1000 became one of the boards that Microsoft used to develop Direct3D. Unfortunately, the card required a motherboard chipset capable of supporting direct memory access (DMA), since the Rendition used this method to transfer data across the PCI interface. The V1000 fared well in comparison with virtually every other consumer graphics board prior to the arrival of the Voodoo Graphics, which had more than double the 3D performance. The board was relatively cheap and offered a good feature set, including edge antialiasing for the budget gamer and hardware acceleration of id Software’s Quake. Game developers, however, shied away from the DMA transfer model all too soon for Rendition’s liking.
Like 1996, 1997 proved to be another busy year in the consumer graphics industry.
ATI moved from strength to strength as they launched the Rage II, followed by the 3D Rage Pro in March. The latter was the first AGP 2x card and the first product to come out of ATI’s 3D Engineering Group formed in 1995.
ATI 3D Rage Pro
The Pro nearly equalled the Voodoo Graphics performance in 4MB form, and outperformed the 3Dfx card when using 8MB and the AGP interface. The card improved on the Rage II’s perspective correction, along with texturing ability and trilinear filtering performance thanks to an expanded 4kB cache and added edge anti-aliasing. There was also an incorporated floating-point unit to decrease reliance on the CPU as well as hardware acceleration and display support for DVD.
All in all, the Rage Pro added greatly to ATI’s bottom line, helping the company realise a CAD$47.7 million profit on sales exceeding CAD$600 million. Much of this success came from OEM contracts, integration on consumer and server motherboards, and mobile variants. Prices for the card (usually sold as the Xpert@Work and Xpert@Play) ranged from $170 for the 2MB version, to $200-230 for the 4MB model, and $270-300 for 8MB. The 16MB edition would exceed $400.
ATI bolstered its portfolio by acquiring Tseng Labs’ IP for $3 million and taking in forty of the company’s engineers in December 1997. It was a bargain deal, as Tseng’s failure to integrate a RAMDAC into its cards had caused a sharp fall in sales, from $12.4 million in 1996 to $1.9 million in 1997.
3DLabs announced the revised Permedia (“Pervasive 3D”) series of boards in March 1997, built on Texas Instruments’ 350nm process rather than the IBM process used for the original Permedia and the workstation-oriented Permedia NT. Performance was substandard for the former, while the NT model fared somewhat better thanks to an additional Delta chip handling full triangle and anti-aliasing setup, albeit at a $300 price tag. Permedia 2-based cards started shipping towards the end of the year, but rather than going head-to-head with the gaming heavyweights, they were marketed as semi-pro 2D cards with moderate 3D graphics ability.
A month after ATI and 3DLabs refreshed their line-ups, Nvidia replied with the RIVA 128 (Real-time Interactive Video and Animation accelerator), which added Direct3D compatibility by rendering triangular polygons.
The company maintained their association with ST Micro, who produced the chip on their new 350nm process and developed the RAMDAC and video converter. While initial drivers were problematic (notably with Unreal), the card showed enough performance in games such as Quake 2 and 3 to top many benchmark review charts.

Diamond Viper V330 PCI (Nvidia RIVA 128)
This proved to be the landmark card that Nvidia had been looking for since 1993. It was such an economic and critical success that Nvidia needed to look further afield to maintain supply, signing a manufacturing deal with TSMC to produce Riva 128ZXs alongside ST Micro. Nvidia’s 3D graphics market share at the end of 1997 was estimated at 24%, ranking second behind 3Dfx Interactive, largely thanks to the Riva 128/128ZX.
Nvidia’s coffers were also boosted by Sega’s funding of the NV2 as a possible graphics chip for the Dreamcast console, even though the final contract was to be awarded to NEC/VideoLogic.
Rival 3Dfx also collaborated with Sega on the project and was widely believed to be providing the hardware for the console until Sega terminated the contract. 3Dfx filed a $155 million lawsuit claiming it was misled by Sega into believing the company was committed to using 3dfx hardware, and had in turn been given access to confidential materials relating to its graphics IP. The two settled out of court for $10.5 million a year later.
3Dfx based Sega BlackBelt prototype
The Dreamcast “Black Belt” project was just one facet of a busy year for 3Dfx Interactive.
Quantum3D was spun off from 3Dfx on March 31, 1997. SGI and Gemini Technology partnered with the company to work on very high-end enthusiast and professional graphics solutions leveraging 3dfx’s new SLI (Scan Line Interleave) technology. This involved using a daughter card carrying a second chipset and memory connected via header, or two or more cards connected via ribbon cable, much in the way that Nvidia’s SLI and AMD’s CrossFire implement the concept today. Once connected, each card – or logic block in the case of single-board SLI cards – contributed half the scan lines per frame to the display.
SLI also increased the maximum screen resolution from 800 x 600 to 1024 x 768 pixels. An Obsidian Pro 100DB-4440 (two single cards, each with an Amethyst daughter card) retailed for $2,500, while a single-card SLI solution like the 100SB-4440 or 4440V required an outlay of $1,895.
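As a rough illustration of the concept (a hypothetical sketch, not 3dfx’s actual hardware logic), scan line interleaving simply alternates a frame’s lines between the two render units:

```python
# Toy illustration of 3dfx-style Scan Line Interleave (SLI): each of two
# cards/logic blocks renders every other scan line of the frame.
HEIGHT = 768  # SLI raised the maximum resolution to 1024 x 768

def assigned_unit(scan_line: int) -> int:
    return scan_line % 2  # unit 0 takes even lines, unit 1 takes odd lines

lines_per_unit = [0, 0]
for line in range(HEIGHT):
    lines_per_unit[assigned_unit(line)] += 1

print(lines_per_unit)  # [384, 384] -- each unit renders half the frame
```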
In the summer of 1997, 3Dfx announced its initial public offering and launched the Voodoo Rush in an attempt to offer a single card with both 2D and 3D capability. The end product, however, was unable to use the proposed Rampage chip and ended up as a cut-down Voodoo. The card’s SST-1 chip handled Glide API games, while a sub-standard Alliance chip -- or worse, a Macronix chip -- handled other 3D games and 2D applications. This resulted in screen artifacting, as the 3dfx chip and memory ran at 50MHz while the Alliance AT25 ran at 72MHz.
Did you know?
TechSpot was originally Julio's personal project (a tech blog, if you will -- circa 1998). The site was named "PURE Rendition" as it was dedicated to report the latest news on the Rendition Vérité 3D chips. Eventually it evolved into "3D Spotlight" as we moved on to cover the entire 3D graphics scene as well as the popular "3D" soundcards of the time. The TechSpot.com domain was acquired shortly after the 90s dot-com bubble for a handsome $200 as the original owners had no good use for it.
Things got worse, as the Voodoo Rush’s framebuffer was essentially halved by being shared between the 3D and 2D chips, limiting resolution to around 512x384. Refresh rates took a nosedive as well, since the Alliance and Macronix chips were limited to 175MHz and 160MHz RAMDACs respectively.
Sunnyvale-based Rendition released the Vérité V2100 and V2200 shortly after the Voodoo Rush launched. The cards still couldn’t match the first Voodoo in performance, and were barely competitive with the budget-oriented Rush. The company’s R&D lagged significantly behind its competitors’, and with game developers showing little interest in the cards, these turned out to be Rendition’s last commercial graphics products.
Rendition had various other projects in the works, including adding a Fujitsu FXG-1 geometry processor to the V2100/V2200 in a two-chip approach at a time when other vendors were working to integrate geometry processing into a single chip. The FXG-1-powered (and gloriously named) Hercules Thriller Conspiracy card thus remained an uncompleted project, along with the V3300 and 4400E, after Micron acquired the company in September 1998 for $93 million, hoping to combine embedded DRAM technology with Rendition’s graphics expertise.

Rendition Verite V2200 reference board
As feature sets and performance increased, so did the prices for graphics cards, and a number of vendors who couldn’t outgun the rising tide of ATI, Nvidia, and 3Dfx rushed in to fill the sub-$200 market.
Matrox released the Mystique (hobbled by its lack of OpenGL support) for $120-150, while S3 had the ViRGE line starting at around $120 for the base model and going up to $150 or $200 for the DX or GX respectively. S3 diversified its line to ensure a steady stream of sales by adding a mobile card with dynamic power management (ViRGE/MX), and the desktop ViRGE/GX2 with TV-Out, S-Video and assisted DVD playback.
Slotting in below these were Cirrus Logic’s Laguna 3D series, Trident’s 9750/9850, and the SiS 6326, all of which fought for gamers’ attention. For the Laguna 3D, a bargain $99 price was not enough to make up for weak performance, poor 3D image quality and consistency issues when compared with similarly priced cards like S3’s ViRGE VX.
Cirrus Logic left the graphics industry fairly quickly after the Laguna3D launched. Prior to that they had also offered a range of budget 16-bit color graphics adapters in the $50 bracket, most notably the Diamond SpeedStar series and the Orchid Kelvin 64.
Trident also targeted the entry-level bracket with the 3DImage 9750 in May, and an updated 9850 shortly thereafter featuring AGP 2x bus support. The 9750 was a PCI or AGP 1x card and had a variety of graphics quality and rendering issues. The 9850 had remedied some of the quirks, but texture filtering was still a hit-or-miss proposition.
SiS added their entry to the budget 3D graphics market in June with the 6326, typically priced in the $40-50 range. The card offered good image quality and outperformed many other budget cards. While never a threat in the performance arena, the 6326 sold to the tune of seven million units in 1998.
A long running saga that would grow to encompass elements of myth and urban legend was born at the Assembly gaming event in June 1997, when BitBoys announced their Pyramid3D graphics to the world. The much-hyped project was a combined effort by Silicon VLSI Solutions Oy, TriTech and BitBoys.
But Pyramid3D never saw the light of day, with extensive debugging and revisions delaying the project, and TriTech losing a sound chip patent suit that eventually bankrupted the company.

Demo screenshot showing the realism that Glaze3D cards were supposed to achieve.

BitBoys went on to announce a second design, the Glaze3D chip, on May 15, 1998, promising class-leading performance and a planned release by the end of 1999. As the time for the grand reveal approached, BitBoys announced a revised design at SIGGRAPH99 in October that did away with the RAMBUS memory and memory controller in favour of 9MB of embedded DRAM from Infineon.
Once again, bug-hunting and manufacturing problems led to the project’s cancelation.
The company was earning a reputation for missing release dates and essentially producing nothing but vapourware. Glaze3D was later redesigned under the codename Axe, catching up to the competition with support for DirectX 8.1. The new chip was meant to debut in the Avalanche3D card by the end of 2001, and in the meantime a third iteration of Glaze3D, codenamed Hammer, was already promising DirectX 9 support.
Prototype Avalanche3D boards were built with the initial run of chips, but everything came to a halt when Infineon stopped producing embedded DRAM in 2001 due to mounting financial losses. Lacking a manufacturing partner, BitBoys finally gave up on desktop graphics and focused on mobile graphics IP instead.
BitBoys' exit and AMD's blunder: In May 2006, ATI acquired BitBoys for $44 million and announced the opening of a European design center. Soon after, ATI and Nokia entered into a long term strategic partnership. Just a couple of months later a then-healthy AMD announced it would acquire ATI in a grossly overvalued $5.4 billion deal. The mobile unit that included the BitBoys personnel was renamed Imageon and in a major lack of management foresight, it was sold off for $65 million to Qualcomm in January 2009. The latter continues to produce graphics under the name Adreno (an anagram of Radeon) as an integral component of the hugely popular Snapdragon SoC.
Intel launched its first (and so far only) commercial discrete 3D desktop gaming chip in January 1998. The i740 traces its origins to a NASA flight simulation project for the Apollo program undertaken by General Electric, later sold to Martin Marietta, which merged with Lockheed three years later. Lockheed-Martin repurposed the project as Real3D for professional graphics products, notably the Real3D/100 and the Real3D/Pro-1000. The Sega Model 3 arcade board featured two of the Pro-1000 graphics systems.
Lockheed-Martin then formed a joint project with Intel and Chips and Technologies named Project Aurora. Intel bought 20 percent of Real3D in January, a month before the i740 launched. By this stage Intel had already purchased 100 percent of Chips and Technologies in July 1997.
The i740 combined the resources of the R3D/100’s two distinct graphics and texture chips, but it was somewhat of an oddity in that Intel implemented AGP texturing, where textures were kept in system memory (the render buffer could also be stored in RAM). Some designs used the card’s frame buffer to hold textures, swapping to system RAM if the frame buffer became saturated or a texture was too large to store in local graphics memory.
To minimize latency, Intel’s design used the AGP Direct Memory Execute (DiME) feature, which fetched only those textures required for rasterization and left the rest in system RAM. Performance and image quality were acceptable, roughly matching the high-end offerings of the previous year, and at $119 for the 4MB model and $149 for 8MB, pricing reflected Intel’s aggressive marketing. The i740 was sold either as an Intel-branded card, the Real3D StarFighter, or the Diamond Stealth II G460.
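A loose sketch of the texturing model described above, with hypothetical names and data (our simplification, not Intel’s actual memory controller logic):

```python
# Loose sketch of the AGP texturing / DiME idea: textures live in system RAM,
# and the chip fetches only those a frame actually needs over the AGP bus.
system_ram = {"wall": "...texels...", "floor": "...texels...", "sky": "...texels..."}
agp_fetches = []

def fetch_over_agp(name: str) -> str:
    agp_fetches.append(name)           # each fetch costs AGP bus bandwidth
    return system_ram[name]

def rasterize(visible_textures: list[str]) -> None:
    for name in visible_textures:
        texels = fetch_over_agp(name)  # pulled on demand, never pre-copied
        # ... sample `texels` for the pixels being drawn ...

rasterize(["wall", "floor"])           # "sky" is never fetched this frame
print(agp_fetches)                     # ['wall', 'floor']
```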

Intel740 / i740 AGP graphics board
Intel designed a revised i752 chip, but a lack of interest from OEMs and the gaming community in general caused the company to cancel commercial production. A few boards made it out of the manufacturing plant, but the i752 design, like the i740’s, instead found its way into integrated graphics chipsets.
Lockheed-Martin closed down Real3D in October 1999, with the related IP being sold to Intel. Many of the staff subsequently moved over to Intel or ATI.
ATI gave the Rage Pro a makeover in February 1998, which basically consisted of renaming the card the Rage Pro Turbo and delivering a set of drivers highly optimized for synthetic benchmarks. There was little else to it except a price tag bumped to $449, although drivers from beta 2 onwards did improve real-world gaming performance.
ATI followed up with the Rage 128 GL and VR in August – the first of the company’s products that former Tseng Labs engineers worked on. Retail supply was less than ideal until well into the new year, however, which effectively killed any chance ATI had of stamping its mark on the gaming landscape as it had done in the OEM market. Specs included 32MB of on-board RAM (16MB and 32MB on the All-In-Wonder 128 version) and an efficient memory architecture, allowing the card to push past Nvidia’s TNT as screen resolution increased and 32-bit color was used. Unfortunately for ATI, many games and much of the installed hardware base at the time were geared for 16-bit color. Image quality was much the same as the mainstream competition from S3 and Nvidia, yet still lagged behind Matrox’s.
Nevertheless it was enough for ATI to become the top graphics supplier in 1998 with 27% of the market, and net income of CAD$168.4 million on sales of CAD$1.15 billion.
ATI announced the acquisition of Chromatic Research in October of that year for $67 million. Chromatic’s MPACT media processors had found favour in many PC TV solutions -- notably from Compaq and Gateway -- offering very good 2D graphics performance, excellent audio and MPEG2 playback, but limited 3D gaming performance at a pricey $200 or so. In the end, insurmountable software issues doomed the company to a four-year lifespan.
Two months after the i740 made its small splash in the graphics market, 3Dfx introduced the Voodoo 2. Like its predecessor, it was a 3D-only solution, and while impressive, it represented a complex system. The boards sported two texturing ICs, enabling the first example of multitexturing in a consumer graphics card, but as a result the Voodoo 2 used a total of three chips rather than the single chip with combined 2D/3D capabilities found on competing cards.
GLQuake running on a Pentium MMX 225MHz with 3Dfx Voodoo 2 graphics
Quantum3D’s implementations of the Voodoo 2 included the Obsidian2 X-24 as a single SLI card that could be paired with a 2D daughter card, the single slot SLI SB200/200SBi with 24MB of EDO RAM, and the Mercury Heavy Metal, which featured four 200SBi SLI boards connected via a controller board (AAlchemy) that served the function of SLI bridges found in today’s multi-GPU card setups.
The latter was a professional graphics solution intended for visual simulators, and as such it carried a hefty $9,999 price tag, while requiring an Intel BX or GX server board with four contiguous PCI slots.

Four 200SBi SLI boards connected via AAlchemy controller board.

The Voodoo Banshee was announced in June 1998, but it didn't hit retail for another three months. The card married the 2D portion of the still-AWOL Rampage chipset to a single texture mapping unit (TMU), so while 3Dfx could now offer a single chip with combined 2D and 3D capability at a much reduced production cost, the Banshee fell substantially behind the Voodoo 2 when rendering multitextured polygons.
The revolution that 3dfx had ushered in three years earlier was now passing it by.
In raw 3D performance, the Voodoo 2 had no equal, but the competition was gaining ground fast. Amid increasing competition from ATI and Nvidia, 3Dfx looked to retain a higher profit margin by marketing and selling boards itself, something previously handled by a lengthy list of board partners. To this end, 3Dfx purchased STB Systems on 15 December for $141 million in stock, but the venture proved a giant misstep: the quality and cost of manufacture at STB's Juarez, Mexico plant could not compete with the Taiwanese manufacturing (TSMC) behind Nvidia's products, nor with ATI's Taiwanese foundry partner, UMC.
Many of 3dfx’s former partners formed ties with Nvidia instead.
Compounding 3Dfx’s mounting pressure in the marketplace, March 23 saw the launch of Nvidia’s Riva TNT -- the name standing for TwiN Texel rather than the explosive. Adding a second parallel pixel pipeline to the Riva design doubled the pixel fillrate and rendering speed, and the card carried a prodigious (for 1998) 16MB of SDR memory -- the Voodoo 2’s 8-16MB of RAM was of the slower EDO variety. While a strong contender, its performance was tamed by its own complexity: an eight million transistor chip on TSMC’s 350nm process could not run at Nvidia’s intended 125MHz core/memory frequency due to heat, and thus shipped at 90MHz instead. That sizeable 28% reduction (two pipelines at 90MHz yield a theoretical 180 megapixels per second, versus the 250 a 125MHz part would have delivered) was enough to ensure that the Voodoo 2 barely remained the performance leader, largely thanks to Glide.
Even with the reduced specification, the TNT was an impressive card. Its AGP 2x interface allowed for gaming at 1600 x 1200 and 32-bit color rendering with a 24-bit Z-buffer (image depth representation). This was a huge improvement over the Voodoo 2’s 16-bit color support and 16-bit Z-buffer. The TNT traded blows with the Voodoo 2 and Banshee, offering a better feature set, better scaling with CPU clock speed, excellent AGP texturing and better 2D performance. The card didn’t ship in any meaningful quantities until September.
Not everything was going Nvidia’s way, at least not initially.
SGI filed a lawsuit against Nvidia on April 9, alleging patent infringement over texture mapping. The resulting settlement in July 1999 gave Nvidia access to SGI’s professional graphics portfolio, while SGI wound down its own graphics hardware development and turned its low-level graphics team over to Nvidia. This virtual giveaway of IP is generally regarded as one of the main reasons why SGI roared into bankruptcy at breakneck speed.
With the major players in the market dominating the media coverage for the first few months of the year, June and July spotlighted two of the fading lights of the industry.
On June 16, Number Nine launched their Revolution IV card.
The card proved unable to match the strides Nvidia and ATI were making in 3D performance, so the company attempted to reinforce its position in the 2D productivity market instead.
SGI flat panel bundled with the Revolution IV-FP
Number Nine had always favoured 2D performance over allocating resources to 3D technology, and found itself hemmed in on both fronts by gaming cards such as Nvidia’s TNT. The company decided to exploit the one real weakness afflicting most dedicated gaming cards: high display resolutions at 32-bit color.
To this end, Number Nine added a 36-pin OpenLDI connector to the Revolution IV-FP that would connect to an SGI flat panel screen bundled with the card. The 17.3” SGI 1600SW (1600x1024) plus Revolution IV-FP package initially retailed for $2795.
This was Number Nine’s last homemade card, as the company went back to selling S3 and Nvidia products. Its assets were acquired by S3 in December 1999 and later sold to engineers from Number Nine’s original design team, who formed Silicon Spectrum in 2002.
S3 announced the Savage3D at the 1998 E3 Expo and – unlike the TNT and Voodoo Banshee – the card arrived at retail shortly thereafter. The penalty for the speedy introduction was half-baked drivers, however. OpenGL games were particularly affected, to the extent that S3 supplied a mini OpenGL driver solely for Quake games.
S3’s original specification called for a 125MHz clock, but yields and heat output meant the final shipping part was clocked at 90-110MHz – many review magazines and websites nonetheless received the higher-clocked 125MHz pre-production samples. A Savage3D Supercharged part at 120MHz was released later on, and Hercules and STB sold the Terminator BEAST and Nitro 3200, respectively, clocked at 120/125MHz. Even though OpenGL emulation and DirectX performance were held back by driver support, sub-$100 pricing for the reference boards and acceptable performance in gaming and video playback brought in some sales.
Between 1997 and 1998, a growing number of graphics vendors left the industry. Among them were Cirrus Logic, Macronix, Alliance Semiconductor, Dynamic Pictures (sold to 3DLabs), Tseng Labs and Chromatic Research (both bought by ATI), Rendition (sold to Micron), AccelGraphics (bought by Evans & Sutherland), and Chips and Technologies (engulfed by Intel).
The gulf between the haves and have-nots became even more obvious in 1999.
January saw the release of the SiS 300, a budget business-machine graphics card. The SiS 300 offered minimal 3D performance by 1999 standards, and its 2D was no match for most of SiS’s competitors in the retail market -- a single pixel pipeline saw to that. Luckily for SiS, OEMs had no qualms, since the card ticked enough feature checkboxes to satisfy them: a 128-bit memory bus (64-bit in the SiS 305 revision), 32-bit color support, DirectX 6.0 (DX7 for the 305), multitexturing, TV-out, and hardware MPEG2 decoding.
The SiS 315 followed in December 2000, adding a 256-bit memory bus, DirectX 8 support, full-screen AA, a second pixel pipeline, a transform and lighting engine, DVD video motion compensation, and DVI support. Performance was generally in the region of a GeForce 2 MX200. The same 315 chip formed the basis of SiS’s 650 chipset for Socket 478 (Pentium 4) boards in September 2001, and of the SiS 552 system-on-a-chip in 2003.
Gaming on a budget: SiS 315 card running Unreal Tournament 2003
Besides SiS’s parts, budget-minded buyers still had a substantial range of options to choose from. Among them was the Trident Blade 3D (~$65), whose passable 3D performance -- spotty driver support aside -- was generally on par with Intel’s i740.
Buoyed by this, Trident went on to release the Blade 3D Turbo, with clocks boosted from 110MHz to 135MHz, which helped it keep pace with Intel’s revised i752. Adding to Trident’s woes, its integrated graphics association with VIA was about to come to an abrupt halt when VIA acquired S3 Graphics in April 2000.
Trident’s core graphics business was by then largely dependent upon high-volume, low-priced chips, predominantly in the mobile sector. The Blade 3D Turbo was refined into the Blade T16, T64 (143MHz), and XP (166MHz), but Trident’s 3D development was moving much more slowly than the market in general -- so much so that even a much-delayed budget offering like the SiS 315 handily disposed of the company’s new cards. Trident’s graphics division was sold to SiS’s XGI subsidiary in June 2003.
The S3 Savage4 was a step up in performance from the SiS and Trident offerings. The card had been announced in February, with retail availability from May at $100-130 depending on whether it carried 16 or 32MB of onboard memory. S3’s texture compression (S3TC), introduced with the Savage3D, ensured that even with a limited 64-bit memory bus, textures up to 2048x2048 pixels could be accommodated.
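The arithmetic behind that claim is easy to verify. S3TC in its common form stores each 4x4 block of texels in 64 bits -- 4 bits per texel -- and the back-of-the-envelope C snippet below (an illustration, not anything S3-specific) compares that against an uncompressed 16-bit texture:

    #include <stdio.h>

    /* Texture footprint: 16bpp uncompressed versus S3TC, which packs
       each 4x4 texel block into 8 bytes (4 bits per texel). */
    int main(void)
    {
        const unsigned w = 2048, h = 2048;
        unsigned long raw  = (unsigned long)w * h * 2;             /* 16bpp */
        unsigned long s3tc = (unsigned long)(w / 4) * (h / 4) * 8; /* DXT-style */
        printf("uncompressed: %lu bytes, S3TC: %lu bytes (%lux smaller)\n",
               raw, s3tc, raw / s3tc);
        return 0;
    }

A 2048x2048 texture thus shrinks from 8MB to 2MB, which is what made such sizes workable over a narrow 64-bit bus.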

Diamond Viper II Z200 (S3 Savage4)
The Savage4 became S3’s first card capable of supporting multitexturing, and the first card to support the AGP 4x interface. But not even improved drivers and a reasonable feature set could offset the fact that it struggled to attain the performance level of the previous generation of cards from 3dfx, Nvidia and ATI. This cycle was repeated at the end of the year when the Savage 2000 launched. The card achieved better than parity with the TNT2 and Matrox G400 at 1024x768 and below, but 1280x1024/1600x1200 was a different story.
The first of 3dfx’s Voodoo3 series arrived in March, backed by an extensive television and print advertising campaign, a new logo – now with a small “d” – and vivid box art. The long-awaited Rampage chipset still hadn’t arrived, so the boards sported much the same architecture as before, somewhat tweaked in the Avenger chipset. They remained hamstrung by a reliance upon 16-bit color, a 256x256 maximum texture size, and a lack of hardware-based transform and lighting (T&L) -- factors that were becoming crucial to game developers, and on which 3dfx continued to disappoint by not delivering its architectural and feature-set promises.
In what seems a time-honoured tradition, 3dfx blamed an earthquake for its failing fortunes, although this didn’t really impact ATI and Nvidia that much. In another sign of 3dfx’s mounting troubles, the company announced in December that their Glide proprietary graphics API would finally be available as open source, at a time when DirectX and OpenGL continued to gain traction with game developers.
March also saw Nvidia release the Riva TNT2, including the first of its Ultra branded boards with faster core and memory speeds, while Matrox unveiled the G400 series.
The TNT2 utilized TSMC’s 250nm process and managed to deliver the performance Nvidia had hoped for with the original TNT. It comprehensively outperformed the Voodoo 3, the only exceptions being applications utilizing AMD’s 3DNow! CPU instruction extension in conjunction with OpenGL. Keeping up with 3dfx and Matrox, the TNT2 also included DVI output for flat panel displays.
Meanwhile, the Matrox G400 managed to outperform both the Voodoo 3 and TNT2 for the most part, although OpenGL support still noticeably lagged behind. At $199-229, the card represented excellent value for money from a performance, image quality and feature set standpoint. The ability to drive two monitors via twin display controllers (called DualHead by Matrox) started a multi-monitor support trend for the company. The secondary monitor in this case was limited to 1280x1024 resolution.
The G400 also introduced Environment Mapped Bump Mapping (EMBM), which provided more convincing texture detail. For those with slightly deeper pockets, the higher-clocked G400 MAX at $250 remained the fastest consumer card on the market until GeForce 256 DDR-based boards such as the Creative Labs 3D Blaster Annihilator Pro hit store shelves in early 2000.
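The idea is easy to express in code, even if evaluating it per pixel in 1999 demanded dedicated hardware: values read from a bump texture perturb the coordinates used to sample an environment map, so a flat surface reflects its surroundings as though it were bumpy. The C sketch below is a minimal illustration of the concept, not Matrox’s implementation:

    #include <stdio.h>

    typedef struct { float u, v; } UV;

    /* The core of EMBM: per pixel, a (du, dv) pair read from the bump
       texture offsets the coordinates used to sample the environment
       map, creating the illusion of a bumpy reflective surface. */
    static UV perturb(UV env_coord, UV bump, float scale)
    {
        UV out = { env_coord.u + bump.u * scale,
                   env_coord.v + bump.v * scale };
        return out;   /* the environment map is then sampled at 'out' */
    }

    int main(void)
    {
        UV base = { 0.50f, 0.50f }, bump = { 0.10f, -0.05f };
        UV p = perturb(base, bump, 0.25f);
        printf("sample env map at (%.3f, %.3f)\n", p.u, p.v);
        return 0;
    }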
Matrox concentrated on the professional market from this point forward, with a brief return to the gaming market in 2002 with the Parhelia. Triple monitor support was not enough to offset inferior gaming performance and the new wave of DirectX 9.0 compatible hardware.
Matrox G400 Tech Demo EMBM
By the time the smoke had cleared from the 3dfx, Nvidia, and Matrox launches, 3DLabs slipped in the long-awaited Permedia 3 Create!. The card had been announced months earlier, targeting the professional user with an interest in gaming. As such, 3DLabs prioritized 2D functionality over 3D, leveraging the professional graphics expertise acquired in July 1998 from Dynamic Pictures, designers of the superlative Oxygen range of workstation cards.
Unfortunately for 3DLabs, what was important for workstation graphics tended to be complex polygon modelling – usually at the expense of texture fill rate. This was pretty much the reverse of gaming card requirements, where texturing and eye candy took precedence over elaborate wire frame modelling.
Overpriced and outperformed by the TNT2 and Voodoo 3 in gaming scenarios, and not far enough ahead in workstation tasks to differentiate it from the competition, the Permedia 3 represented the last attempt by 3DLabs to build a card with gaming in mind. From that point on, 3DLabs concentrated its efforts on the GLINT R3 and R4 based Oxygen cards, ranging from the $299 VX-1 to the $1,499 GVX 420, while the Wildcat range (such as the $2,499 Wildcat II 5110) remained based on the ParaScale graphics processors gained through the acquisition of Intense3D from Intergraph in July 2000. 3DLabs began integrating its own technology into the Wildcat series with the P9 and P10 processors in 2002, the year Creative Technology bought the company.
3DLabs left the desktop market in 2006 and shifted its focus to media-oriented graphics as the division merged with Creative’s SoC group and was renamed ZiiLabs, which was sold to Intel in November 2012.
ATI’s strides had been somewhat incremental since the Rage 128’s debut. In August 1999, the company added AGP 4x support and a clock boost to the Rage 128 to create the Pro variant of the card, which also featured video capture and TV-out options. The Rage 128 Pro’s gaming performance was broadly equal to Nvidia’s TNT2, but fell well short of the TNT2 Ultra, something ATI intended to remedy with Project Aurora.

ATI's Rage Fury MAXX combined two Rage 128 Pro chips on a single board

When it became apparent ATI would not have a single chip capable of winning the performance race, the project changed tack and was realized as the Rage Fury MAXX, which paired two Rage 128 Pro chips on the same PCB. Specification numbers were impressive, with the two chips each taking responsibility for rendering alternate frames, essentially halving the gaming workload between them. In practice, while the card bested the previous generation’s offerings, it wasn’t a match for the S3 Savage 2000 and fell consistently behind the upcoming GeForce 256 DDR, which was only slightly more expensive at $279 versus ATI’s $249.
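The dispatch logic behind such a scheme -- today known as alternate frame rendering (AFR) -- is simple enough to sketch in C. The gpu_render and gpu_present functions below are hypothetical stand-ins for driver work, not any real API:

    #include <stdio.h>

    enum { NUM_CHIPS = 2 };

    /* Hypothetical stand-ins for the driver handing a frame to one chip. */
    static void gpu_render(int chip, unsigned long frame)
    {
        printf("chip %d renders frame %lu\n", chip, frame);
    }
    static void gpu_present(int chip) { (void)chip; /* scan out the frame */ }

    /* Alternate frame rendering: even frames go to chip 0, odd frames to
       chip 1, so each chip has two frame-times to finish its work. */
    static void render_loop(unsigned long frames)
    {
        for (unsigned long frame = 0; frame < frames; ++frame) {
            int chip = frame % NUM_CHIPS;
            gpu_render(chip, frame);
            gpu_present(chip);
        }
    }

    int main(void) { render_loop(4); return 0; }

The catch, then as now, is that AFR raises throughput rather than reducing per-frame latency, and it leans heavily on the driver to keep both chips fed.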
Less than two months after the Rage Fury MAXX announcement, Nvidia announced the GeForce 256 SDR on October 1, followed in February 2000 by the DDR version -- the first card to use that form of RAM. A 23-million-transistor chip built on TSMC’s 220nm process, the GeForce 256 was also the first graphics chip to actually be called a GPU (Graphics Processing Unit), on the strength of its added transform and lighting engine (TnL, or T&L).
This engine allowed the graphics chip to undertake the heavily floating-point intensive calculations of transforming the 3D objects and scenes – and their associated lighting – into the 2D representation of the rendered image. Previously, this computation was undertaken by the CPU, which could easily bottleneck with the workload, and tended to limit available detail.
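In rough terms, the per-vertex work being moved off the CPU looks like the following C sketch: a 4x4 matrix transform toward screen space plus a simple diffuse lighting term. Clipping and the perspective divide are omitted, and all names are illustrative:

    #include <stdio.h>

    typedef struct { float x, y, z; } Vec3;
    typedef struct { float m[4][4]; } Mat4;   /* row-major 4x4 matrix */

    /* The "T" half: transform a position by a combined model-view-projection
       matrix (w assumed 1). This multiply-accumulate work, repeated for every
       vertex each frame, is what a hardware T&L engine takes off the CPU. */
    Vec3 transform(const Mat4 *mvp, Vec3 v)
    {
        Vec3 r;
        r.x = mvp->m[0][0]*v.x + mvp->m[0][1]*v.y + mvp->m[0][2]*v.z + mvp->m[0][3];
        r.y = mvp->m[1][0]*v.x + mvp->m[1][1]*v.y + mvp->m[1][2]*v.z + mvp->m[1][3];
        r.z = mvp->m[2][0]*v.x + mvp->m[2][1]*v.y + mvp->m[2][2]*v.z + mvp->m[2][3];
        return r;
    }

    /* The "L" half: a per-vertex diffuse (Lambert) term, N.L clamped at zero. */
    float diffuse(Vec3 n, Vec3 light_dir)
    {
        float d = n.x*light_dir.x + n.y*light_dir.y + n.z*light_dir.z;
        return d > 0.0f ? d : 0.0f;
    }

    int main(void)
    {
        Mat4 id = {{{1,0,0,0},{0,1,0,0},{0,0,1,0},{0,0,0,1}}};
        Vec3 v = { 1.0f, 2.0f, 3.0f };
        Vec3 t = transform(&id, v);
        Vec3 n = { 0, 0, 1 }, light = { 0, 0, 1 };
        printf("transformed: (%.1f, %.1f, %.1f), diffuse: %.1f\n",
               t.x, t.y, t.z, diffuse(n, light));
        return 0;
    }

Repeat that for every vertex of every frame and the appeal of dedicated hardware becomes obvious.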
Nvidia Grass Demo (GeForce 256)
The GeForce 256’s status as the first chip to incorporate hardware T&L has long been the subject of debate, because a number of other designs also incorporated T&L -- either at the prototype stage (Rendition Vérité V4400, BitBoys Pyramid3D, 3dfx Rampage), at a level approaching irrelevancy (3DLabs GLINT, the Matrox G400’s WARP), or via a separate on-board chip (the Hercules Thriller Conspiracy).
None of these made it into functional commercial products, however. Moreover, by being first to adopt a four-pipeline architecture, Nvidia had an inbuilt performance lead over the competition, which -- combined with the T&L engine -- enabled the company to market the GeForce 256 as a professional workstation card as well.
A month after the desktop variant became available, Nvidia announced their first range of professional workstation Quadro cards, the SGI VPro V3 and VR3, based on the GeForce 256. The cards leveraged SGI’s graphics technology Nvidia had gained access to through a cross-license agreement signed in July 1999.
Nvidia’s $41 million profit for the year on revenue of $374.5 million comfortably eclipsed 1998’s figures of $4.1 million and $158.2 million, respectively, and represented a huge leap from 1997’s $13.3 million in revenue. Microsoft’s initial payment of $200 million for the NV2A (the graphics core for the Xbox) added to Nvidia’s coffers, as did a $400 million secondary bond and stock offering in April.
These numbers paled next to ATI’s $1.2 billion in revenue and $160 million profit for the year, thanks to a 32% share of the graphics market. But ATI was on the verge of losing a significant portion of that OEM business to Intel’s 815 series of integrated graphics chipsets.
This article is the second installment in a series of four. Next week we'll move closer to the present, as the industry takes a turn and major consolidation leaves room for only two players: the beginning of the GeForce vs. Radeon era.
Part 1: (1976 - 1995) The Early Days of 3D Consumer Graphics
Part 2: (1995 - 1999) 3Dfx Voodoo: The Game-changer
Part 3: (2000 - 2005) Down to Two: The Graphics Market Consolidation
Part 4: (2006 - Present) The Modern GPU: Stream processing units a.k.a. GPGPU
 
