Apple Announces Second Generation iPhone 3G, 2.0 Software, iTunes App Store
Apple swings for the fences, again.
When it comes to announcing a new product, Apple knows how to set the stage and get people excited right up until the official announcement. Nowhere was this more obvious than with the launch of the original iPhone. The fervor surrounding the mobile handset didn't settle down when the first generation iPhone was announced in early January 2007 -- it continued until the eventual release of the phone in June of that year.
Speculation on the follow-up, the "3G iPhone", has been building ever since the first generation model was revealed -- but things really started to boil over in recent months. Case makers leaked dimensions for the upcoming phone, supposed "leaked" pictures of the device were drooled over by nearly every gadget site on the web, and leaked firmware was picked over with a fine-tooth comb.
Apple today finally announced its next generation crowd pleaser. This time around, Apple is looking to address the shortcomings of its first effort and further expand the phone's popularity (Jobs previously stated that he wants 10 million iPhones sold within the first 18 months -- the device already surpassed the 6 million mark during its first year).
First things first -- the worst-kept secret about the second generation iPhone is its 3G capability. The first gen model was widely criticized for its slow EDGE cellular data speeds. Apple is now matching the competition with the iPhone 3G. The new handset has a tapered look with thinner edges, solid metal buttons, a black plastic backing, a flush headphone jack [thank goodness], and vastly improved audio.
The faster cellular connectivity of the iPhone 3G allows for download speeds nearly as quick as WiFi and 2.5 times as fast as EDGE. The iPhone 3G also sports better battery life than its predecessor: 10 hours of 2G talk time, 5 hours of 3G talk time, 7 hours of video, 24 hours of audio, and 5 to 6 hours of high-speed web browsing.
Another big addition is fully integrated GPS. iPhone 3G users can now get positioning information from WiFi, cell towers, and the built-in GPS hardware.
Apple also confirmed early speculation that price breaks would be in store for the new lineup of iPhones. The Cupertino, California-based company confirmed today that the new 8GB iPhone will be priced at $199 with a new two-year contract when it launches July 11, while the 16GB iPhone (which will be available in white at a later date) will set you back $299 under the same terms.
The iPhone 3G will be rolled out in 22 countries on July 11 (Australia, Austria, Belgium, Canada, Denmark, Finland, France, Germany, Hong Kong, Ireland, Italy, Japan, Mexico, Netherlands, New Zealand, Norway, Portugal, Spain, Sweden, Switzerland, UK and the U.S.).
Apple first announced the Software Development Kit (SDK) for the iPhone in early March along with the 2.0 firmware update. The SDK allows third-party developers to create their own applications for the iPhone and iPod touch and upload them to the new iTunes App Store.
Developers pay a $99 fee to publish applications to the iTunes App Store -- Apple also takes a 30% cut of the purchase price of each application sold to cover hosting and processing fees. Developers who offer their apps for free on the iTunes App Store are not charged those hosting and processing fees.
Jobs noted that applications less than 10MB in size will be downloadable over the cell network -- applications larger than 10MB will have to be downloaded through a WiFi connection or through the desktop iTunes application. Automatic updates for applications will also be pushed to the device.
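For a rough sense of what those terms mean in practice, the sketch below (in Python, purely illustrative -- the prices, function names, and structure are our own, not Apple's) works out a developer's take on a paid sale and which delivery paths the 10MB rule leaves open.

```python
# Back-of-the-envelope sketch of the App Store terms described above.
# The 30% cut and 10MB cellular-download cap come from the announcement;
# the app prices and function names here are purely illustrative.

APPLE_CUT = 0.30           # Apple's share of each paid sale
CELL_DOWNLOAD_CAP_MB = 10  # larger apps require WiFi or desktop iTunes

def developer_payout(price: float) -> float:
    """Developer's share of a single paid sale (free apps owe nothing)."""
    return 0.0 if price == 0 else price * (1 - APPLE_CUT)

def delivery_options(size_mb: float) -> list[str]:
    """How a given app can reach the device under the stated 10MB rule."""
    paths = ["WiFi", "desktop iTunes"]
    if size_mb <= CELL_DOWNLOAD_CAP_MB:
        paths.insert(0, "cellular network")
    return paths

if __name__ == "__main__":
    # e.g. a $9.99 app nets its developer roughly $6.99 per sale
    print(f"${developer_payout(9.99):.2f} to the developer per $9.99 sale")
    print("A 25MB app can be delivered via:", delivery_options(25))
```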
A number of applications developed using the SDK were on display, including SEGA's Super Monkey Ball (which will be available for $9.99 from the iTunes App Store), an integrated eBay tool complete with bidding and search, and a free news reader from the Associated Press. The latter will send local news to you based on your location; save images, video, and text for offline viewing; and even allow you to submit news as it happens.
The 2.0 software -- which is available not only for the iPhone 3G, but also for the original iPhone and iPod touch -- adds a number of new features to make the devices more corporate friendly. These include push email/calendar/contacts between an iPhone/Mac/PC via MobileMe, auto-discovery, global address lookup, Cisco IPsec VPN, certificates and identities, WPA2/802.1X, and remote wipe.
Other features include contacts search, bulk delete/move for emails, a new scientific calculator, and the ability to save images to the Photo Library. Microsoft PowerPoint documents are now supported as well.
iPhone users will receive the 2.0 software update for free, while iPod touch users will have to pay $9.99. The update will be available next month.
IBM Scientists Demonstrate Liquid Cooling Process for 3D Stacked Chips
IBM scientists say stacked processors have higher power densities than nuclear reactors
As processors add processing cores, circuits, and other components, the amount of heat they generate increases exponentially. Researchers and chipmakers have found that the ability to dissipate heat will be one of the main challenges to making processors in the future.
Researchers from IBM Labs and the Fraunhofer Institute in Berlin have demonstrated a prototype 3D chip that has a liquid cooling system built-in to deliver water directly between each layer of the stacked processor.
The 3D chip stacks take components that sit side-by-side in a traditional chip and stack them atop one another in layers. While the process allows chipmakers to create more powerful chips, with shortened interconnects between components for data to travel, it presents significant cooling challenges. With a 3D stacked chip design, data has to move only 1/1000th of the distance it would need to travel on a traditional 2D chip. In addition to the shortened pathways, the 3D process also allows for 100 times the number of pathways for data to flow. Together, the two characteristics greatly increase the potential performance of a 3D stacked chip.
The issue for researchers designing these 3D chips is that the stacked layers produce a very high level of aggregated heat dissipation -- close to 1 kilowatt in a volume of only half a cubic centimeter. IBM researchers point out that that level of heat dissipation is 10 times higher than that of any other human-made device and that power densities in stacked processor designs are higher than in both nuclear and plasma reactors.
Thomas Brunschwiler, project leader at IBM’s Zurich Research Laboratory said in a statement, “As we package chips on top of each other to significantly speed a processor's capability to process data, we have found that conventional coolers attached to the back of a chip don't scale. In order to exploit the potential of high-performance 3-D chip stacking, we need interlayer cooling. Until now, nobody has demonstrated viable solutions to this problem.”
The solution to the problem the IBM team devised is to pipe water through cooling structures, as thin as 50 microns, between the individual layers of the stacked chip. The scientists were able to show cooling performance of up to 180 W/cm2 per layer in a stack with a footprint of 4 cm2. One IBM researcher says that the cooling performance is a significant breakthrough and that without the breakthrough the stacking of two or more high-density power layers would be impossible.
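To put those figures in perspective, here is a quick back-of-the-envelope calculation using only the numbers quoted above; it is a rough illustration, not IBM's own analysis.

```python
# Back-of-the-envelope numbers from the figures quoted above (illustrative only).

heat_w = 1000.0            # ~1 kW of aggregate heat dissipation
volume_cm3 = 0.5           # in roughly half a cubic centimeter
power_density = heat_w / volume_cm3
print(f"Power density: {power_density:.0f} W/cm^3")   # ~2000 W/cm^3

cooling_w_per_cm2 = 180.0  # demonstrated per-layer cooling performance
footprint_cm2 = 4.0        # stack footprint used in the demonstration
per_layer_cooling_w = cooling_w_per_cm2 * footprint_cm2
print(f"Cooling capacity per layer: {per_layer_cooling_w:.0f} W")  # ~720 W
```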
In experiments, the researchers passed water into a 1 cm by 1 cm test device made up of a cooling layer between two heat sources. The cooling layer was only 100 microns in height (about twice the thickness of a human hair) and had 10,000 vertical interconnects per square centimeter.
The interconnects were hermetically sealed to prevent the water from causing electrical shorts within the chip. Individual layers were built using existing 3D packaging fabrication methods to etch the holes for signal transmission from one layer to the next. Each interconnect had a silicon wall around it and a fine layer of silicon oxide to insulate the electrical connections from the water. A new thin-film soldering technique was developed by the researchers to provide the precision and robustness needed to provide thermal contact for the cooling film.
IBM first announced its 3D chip stacking process in April of 2007.
New Military Supercomputer Breaks Performance Record
Roadrunner supercomputer is first to break petaflop barrier
A new supercomputer in the U.S. has broken a barrier that many thought wouldn't be broken for years to come. The machine -- dubbed Roadrunner -- has broken the petaflop barrier.
Roadrunner was designed by engineers and scientists at IBM and the Los Alamos National Laboratory. Ultimately, Roadrunner will be placed into a classified environment where it will be used to simulate the effects of aging on the U.S. nuclear weapons stockpile. Specifically, it will model how aging nuclear weapons behave during the first fraction of a second of an explosion. Before beginning its nuclear weapons research, Roadrunner will be used to model the effects of global warming.
The Roadrunner supercomputer cost $133 million and is built using a mix of consumer electronics chips and more common server processors.
Roadrunner has 12,960 chips that are an improved version of the Cell chip used in the PS3. These Cell processors act as a turbocharger for certain portions of the calculations the Roadrunner processes. The computer also uses a smaller, unspecified number of AMD Opteron processors.
A computer researcher from the University of Tennessee, Jack Dongarra told the New York Times, “This [breaking the petaflop barrier] is equivalent to the four-minute mile of supercomputing.”
Horst Simon from the Lawrence Berkeley National Laboratory said, "Roadrunner tells us about what will happen in the next decade. Technology is coming from the consumer electronics market and the innovation is happening first in terms of cell phones and embedded electronics."
Technology appearing first in the consumer electronics market and then making its way into supercomputing is a stark reversal of the usual pattern, in which innovations trickle down from supercomputers to consumer devices.
In total, Roadrunner has 116,640 processing cores, and the real challenge for programmers is figuring out how to keep all of those cores busy simultaneously to get the best performance. Roadrunner requires about 3 megawatts of power, or about enough electricity to run a large shopping center.
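Dividing the headline numbers gives a rough sense of the machine's per-core and per-watt performance. The sketch below is only a back-of-the-envelope estimate assuming a sustained one petaflop; it is not an official IBM or Los Alamos breakdown.

```python
# Rough per-core and per-watt figures implied by the numbers above (illustrative).

sustained_flops = 1.0e15   # one petaflop (10^15 floating-point operations/sec)
cores = 116_640            # total processing cores in Roadrunner
power_watts = 3.0e6        # roughly 3 megawatts

print(f"~{sustained_flops / cores / 1e9:.1f} gigaflops per core on average")
print(f"~{sustained_flops / power_watts / 1e6:.0f} megaflops per watt")
```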
To put the processing power in perspective, Thomas P. D’Agostino of the National Nuclear Security Administration said that if all 6 billion people on Earth entered calculations on a calculator for 24 hours a day, seven days per week it would take 46 years to do what Roadrunner can do in one day.
How Roadrunner is cooled has not been detailed; IBM has recently moved to liquid cooling for its supercomputers, but Roadrunner appears to be air cooled.
Ballmer to Retire in 10 Years; Looks Back on Happy, Rocky Relationship With Gates
With Ballmer's new announcement that he will retire within 10 years and Gates' retirement this year, it's worth a look at the pair's past, present, and future.
Microsoft has owned the privilege of being the world's largest software company for well over a decade, and it largely has strong executive leadership to thank for that. Love them or hate them, Microsoft's founder and current Chairman Bill Gates and CEO Steve Ballmer, a key Microsoft veteran, have shaped the face and course of Microsoft and the tech industry as a whole.
When Gates stepped down as CEO in 2000, allowing Ballmer to step up, it was a historic moment for the company. That era is nearing its finale this year, with Gates finally looking to fully retire. And as the future fast approaches, Ballmer just announced that he will follow Gates into retirement in 10 years' time or less.
With the departure of Gates and looking forward to the departure of Ballmer, it’s worth taking a look at how Microsoft has been shaped by the pair’s relationship – one that Ballmer once emotively compared to a marriage which produced many children. The Wall Street Journal provides inside insight into internal dealings that took place during the Gates-Ballmer transition and how their relationship survived its rockiest days.
Gates and Ballmer's story does not start with Microsoft; rather it begins far before that. They first met at Harvard University in the mid-1970s. They were the same age (both are now 52) and they shared a love for poker and intellectual challenges. One fond memory they hold is when they failed to attend a single economics class session, but through collaborative cramming managed to score near perfect marks -- 97 percent for Ballmer and 99 percent for Gates.
The scores were reflective of the pair’s early relationship and work at Microsoft. Both shared almost equal duties, but Gates always owned the dominant position by a hair. Gates took on the role of providing the chief software and business direction, while Ballmer filled in for other necessary top duties. There was no part of the business that the pair couldn't manage. Gates describes the time stating, "For a certain size organization, it was beautiful."
Like most pairs, the two had their share of heated arguments and fights, but they would quickly make up -- and get back to work. In the '90s, Microsoft was forced to restructure, following government antitrust charges and the threat of the burgeoning online industry. This culminated with Gates announcing in 2000 that Ballmer would replace him as CEO and that he himself would begin the transition into retirement.
Bill Gates assumed the role of "Chief Software Architect" -- a role that was beneath Ballmer's. Gates, however, still thought of himself as top dog, and by his own admission would offer sarcasm in important meetings, undermining Ballmer's leadership. Everything from personnel staffing to the Xbox to Windows itself became a battleground for the power struggle in which Gates refused to accept his subordinate role. The clashes had many casualties, among them the eventually defunct NetDocs program, elements of which survived to be incorporated into Office.
In 2001, the board and senior executives intervened, calling Gates and Ballmer into a meeting about the destructive effect their relationship was having. Jon Shirley, a former Microsoft president states, "The board was really concerned about what was going to happen."
In February 2001, perhaps the most important meeting in Microsoft's history took place at the Polaris restaurant in the Bellevue Club Hotel a few miles from Microsoft's campus. Gates and Ballmer have never revealed the details of this meeting, so the world may never know, but the overall gist was to establish Mr. Gates as the "junior partner" to Mr. Ballmer's "senior partner."
Ballmer pledged to learn when to override Gates and when to "let things go". He stated that after the meeting, "We got it figured out."
Meanwhile Gates began to defer to Ballmer to the shock of many. Microsoft Vice President Mich Matthews recalls executives exchanging bewildered glances during such an instance in an important meeting. Gates says he needed to do most of the changing. He stated, "Steve is all about being on the team, and being committed to the mutual goals. So I had to figure out, what are my behaviors that don't reinforce that? What is it about sarcasm in a meeting? Or just going, 'This is completely screwed up'?"
The result was an enriched Microsoft. Ballmer kept the technical savvy, but moved toward a model in which executives took a greater managerial role, as opposed to being deeply involved in technology development. Surprising to some, despite his reputation for ebullience, he did a commendable job making peace with regulators and settling patent disputes.
In recent years, despite its troubles, Microsoft has had its share of shining successes, such as Windows XP, and its modest ones, such as the Xbox program. And after surviving their greatest trial, Gates and Ballmer became incredibly close once again, so much so that they would at times complete each other's sentences. In a recent interview, Ballmer teared up discussing the creation of Microsoft.
He reminisced, "It is a little like giving birth to something. Bill gave birth but I was kind of an early nanny in raising this child. There are fun things we get to do together, that's all nice. I mean, it's important, but this is..."
"...this is what we did," Gates added grinning, a bit misty-eyed himself.
Doubts remain. Gates delivered his last major speech to employees and customers this week, and will now be semi-retired. However, some think that if Microsoft enters a crisis, such as failing to rebound from the Vista slump, Gates won't be able to resist the temptation of a second coming with the company, much like Apple's Steve Jobs. Others are concerned about Microsoft's future because they say that neither Ballmer nor Gates can offer the young blood needed to solve such a crisis.
And then there's Ballmer's own retirement, which he just announced. In an interview, Ballmer commented that he would stay with Microsoft "for another nine or 10 years ... until my last kid goes away to college." While this may seem like a long time, a decade has a habit of slipping by quickly. Microsoft's brass are aware of this, and Ballmer's announcement leaves many pondering what will become of Microsoft in a post-Gates and then a post-Ballmer-and-Gates era.
One possibility for Ballmer's replacement is Lotus Notes creator and current Microsoft chief tech visionary Ray Ozzie, whom Gates once described as "one of the top five programmers in the universe." Ozzie is taking over many of the roles Gates has held since stepping down as CEO, so moving up to the CEO position would not be an unmanageable transition. However, Ozzie is almost the same age as Ballmer and Gates, so he may look to retire himself.
If Ozzie does retire, the future for Microsoft really is a mystery and wide open. Much of the company's budding and veteran leadership -- Joanne Bradford, Rob Short, Jeff Raikes, and Bruce Jaffe to name a few -- has left either to manage elsewhere or to the greener pastures of retirement.
There is much uncertainty with Gates leaving and now with Ballmer's own retirement clock ticking. Whatever the future may hold for Microsoft, though, it is worth appreciating the indelible mark that the relationship between Ballmer and Gates -- rocky at times, warm at others -- has left on both the technology industry and the economy in general over the last two decades.
Microsoft Sees OOXML Stalled Due to International Appeals
"The more you tighten your grip the more star systems will slip through your fingers"
Microsoft's hopes of controlling the open document world were nearing fruition after the International Organization for Standardization (ISO) finally certified its OOXML format at the start of April. The ISO had already ratified ODF, the competing open-source format from the Organization for the Advancement of Structured Information Standards (OASIS) used heavily in Linux, and Microsoft faced a lengthy struggle to get its own format recognized. Without certification it would be tough to push OOXML as a legitimate open document option.
Microsoft had good reason to want to control the world of open documents. As users switch platforms and software more and more, and use an increasing amount of open source solutions, the need for a non-software specific format has surfaced. Microsoft hoped that by making its own proprietary open-file format the preferred standard it could seize control of this budding field.
However, to Microsoft's anger, the process has now been held up by complaints. Following rumors that Microsoft pushed the vote through and used underhanded tactics to suppress dissent, Brazil, India, South Africa, and Venezuela lent such claims credence by filing complaints against the ratification.
The ratification cannot go forward until these complaints are heard, and they must be filed before the end of June. The decision on how to respond will be handed to two management committees. India in particular was quite vocal in its opposition. An open letter written by a member of India's technical standardization committee states that Microsoft's long and ambiguous specification leaves it unclear exactly what is being implemented. He says this means that Microsoft can implement the new format however it wants, defeating the whole purpose of ISO certification -- to promote openness.
He also accuses Microsoft of running a carefully concerted smear campaign to undercut the Indian concerns. He states:
Microsoft started filing complaints to various Indian authorities in early March 2008, claiming bias on part of several members of the committee because of their presumed membership of a group called ‘ODF Alliance India’. My Institution and its representatives are part of the group which has been falsely implicated in these complaints. Worse, the complaints have painted these organizations and their representatives, including the Indian delegation which attended the BRM, as acting against the Indian National interests. This is the most derogatory accusation to any Indian, amounting, personally for me at least, to intolerable blasphemy.
In the letter he alleges that Microsoft pressured the Indian national government to change its stance, and likely did so with other national governments as well. He states that Microsoft behaved in a way "amounting to interfering with the governance process of a sovereign country." He concludes, "I would like to assure all colleagues and other readers that my intentions are purely to respond to the grave provocation caused by the actions of Microsoft."
Meanwhile, ODF creator OASIS tried to steal a bit of the spotlight by calling for an "implementation, interoperability, and conformity" technical committee to maintain ODF's openness and quality. The group plans to try to bring ISO or the World Wide Web Consortium (W3C) into the project. Surprisingly, Microsoft has expressed interest in joining the committee, igniting many conspiracy theories on the internet.
A Big Green Tax Cut; San Francisco Intensifies Solar Grant Efforts
San Francisco is looking to get off the grid and save money with a vast solar push
Independent solar power efforts are growing rapidly. With a number of businesses providing unique, online-coordinated installation options, individuals and small businesses are adopting the technology. And part of that growth is thanks to local government grants.
Many cities and states give citizens large grants to bear some of the capital brunt of buying solar panels. These grants are in essence a big tax break, as the consumer will typically make a good deal of money off the solar panels over their lifetime. They are the alternative energy version of the business world's small business grants.
This Tuesday, San Francisco looked to keep the good times rolling and put some green back in its citizens' pockets with the approval of a massive new grant campaign. Solar panel manufacturers and installers received the news with giddy anticipation and are preparing for the new boom.
For the next ten years, citizens can get $3,000 to $6,000 in a one-time grant to install panels. Both businesses and charities are also receiving some solar love. Businesses and nonprofits can get $10,000 grants, while nonprofit affordable housing can get up to a whopping $30,000. Mayor Gavin Newsom states, "This rebate program further establishes San Francisco as America's solar energy leader and symbolizes the commitment of the city to make affordable solar power available to those who want it."
The mayor says the program should launch this July and will only cost the city $3 million yearly. He says the benefits are far reaching and go beyond just putting money back in the hands of consumers and businesses in energy cost savings. Newsom says the program will attract businesses and will grow green jobs.
If San Francisco can really pull off the green transformation, it will be a significant accomplishment. In green-savvy California, San Francisco has traditionally been somewhat of a laughingstock of the alternative energy community. The city was ranked last in the Bay Area by a recent assessment from the San Francisco Solar Task Force. Of the city's 195,000 rooftops, only 744 had solar panels -- less than 1 percent.
The mayor hopes that the new efforts will panel nearly 10,000 rooftops over the decade, or roughly 5 percent of the city's rooftops. If successful, this would produce around 50 MW of power.
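Combining the city's stated targets gives a rough per-rooftop picture. The figures below are derived solely from the numbers in this article and are not official program estimates.

```python
# Rough per-rooftop figures implied by the program targets above (illustrative).

target_rooftops = 10_000
target_capacity_mw = 50.0
yearly_cost = 3_000_000    # stated yearly cost to the city
program_years = 10

avg_system_kw = target_capacity_mw * 1000 / target_rooftops
print(f"~{avg_system_kw:.0f} kW average system per rooftop")               # ~5 kW

total_outlay = yearly_cost * program_years
print(f"~${total_outlay / target_rooftops:,.0f} of city money per rooftop")  # ~$3,000
```

Notably, the implied per-rooftop outlay lands right around the lower end of the residential grant range.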
Lyndon Rive, CEO of installer SolarCity, which DailyTech recently reported on, is thrilled by the effort. He anticipates the number of panel installations tripling, and as the city's largest solar installer, he's in a prime position for success. With 40 current employees, his company is expanding with a "green" job training initiative in a low-income part of the city. Rive compliments the new program, stating, "It's simple, easy to understand, and easy to implement."
While his company has offered solar leasing, he acknowledges that this strategy was not as cost effective and that most citizens couldn't afford it. Now, between city, state, and federal tax credits, rebates, and grants, an average consumer who would have paid $30,000 for panels can pay a mere $6,000. Kevin Gage, sales director for San Diego-based installer Borrego Solar, states, "This is just gonna spur the industry. The market was essentially shut down in San Francisco. Now a lot of companies like ours are gonna move into San Francisco."
Ironically, the approval was announced the same day San Francisco utility Pacific Gas & Electric announced a 6.5 percent electricity rate hike, blamed on surging fossil fuel costs. San Franciscan Sylvia Ventura is excited about the relief the move may provide her fellow citizens, but she's a bit fearful that the myriad of installers will confuse them. She states, "This business was done for a long time in the shadows and some installers took advantage of people being intimidated by the data, not understanding metering, wattage, and what to pay."
She and her husband Dan Barahona launched a new effort, One Block Off The Grid, which aims to use collective bargaining and other subsidies to further reduce the cost of the panels to an attractive price of "free". She says that the first 50 homeowners that sign up for the program will receive panels free of cost, thanks to the effort's clever negotiating. However, corporate partners are still in the process of being secured and the list is currently only half full.
Whether the new grant program is a glowing success or just a modest one, at the end of the day it's putting money back in the hands of hardworking taxpayers. With rising energy and food costs, consumers in San Francisco will finally get to see some light.
Intel Responds to AMD, NVIDIA USB 3.0 Allegations
Intel says open host controller specifications have cost gazillions of dollars to develop
According to Intel’s Nick Knupffer, there are a lot of myths going around concerning USB 3.0 and Intel’s involvement in the development of the specification. Knupffer wrote a blog post on Intel’s website in an attempt to dispel these myths.
Knupffer points out that Intel is not developing the USB 3.0 specification. What Intel is developing is the host controller spec which Knupffer describes as a “Dummies Guide” to building a USB 3.0 compatible piece of silicon.
Knupffer says in the blog post that Intel has invested “gazillions of dollars and bazillions of engineering man hours” in developing the open host controller and despite its significant investment still plans to give the specification to competing manufacturers for free. Knupffer also says that Intel loves it when CPU performance is used to the max and the huge increase in bandwidth of USB 3.0 will mean larger file transfers and more processor usage. This in turn is expected to lead to an increased demand for faster processors.
AMD and NVIDIA recently leveled allegations claiming that Intel was withholding the open host controller specification in an attempt to give itself a market advantage. The two companies claim that by withholding the specification, Intel will gain a six- to nine-month lead in bringing USB 3.0 compliant products to market.
Intel denied the allegations of withholding the open host controller specification at the time AMD and NVIDIA made their charges public and announced that the two companies would design their own open host controller. In his blog post, Knupffer again says that Intel isn't holding the open host controller specification back from competitors.
According to Knupffer, the significant investment in the open host controller specification is aimed specifically at getting USB 3.0 into the market faster, so it would make little sense for Intel to withhold it. Intel maintains that the specification simply isn't ready yet and that it plans to give it to other manufacturers in the second half of 2008.
The final myth that Knupffer addresses in his post is that USB 3.0 borrows heavily from technology used in PCI Express. Intel points out that it was involved with both the PCI-SIG and the USB-IF at the design stage for both PCI Express and USB 3.0. The insinuation from Intel is that any technology similar between the two was developed on its dime.
Evolution in Escherichia coli Bacterium Observed During Lab Tests
E. coli bacteria show signs of evolution in lab testing
Despite an overwhelming body of scientific evidence, evolution is still a fiercely debated topic in some circles. Many people take evolution for granted, simply understanding that it is the theory accepted by the scientific community based on the strong supporting evidence, and remain relatively oblivious to the controversy.
However, the fact remains that yearly there are many protests and court cases in the U.S. and abroad where people try to block educational attempts to teach the theory of evolution and replace them with religious-based theories.
Fortunately for evolutionary scientists, they now have perhaps the greatest piece of evidence of all -- the largest evolutionary leap observed to date. The experiment started inconspicuously, with researchers at Michigan State University in East Lansing using a single Escherichia coli bacterium and its descendants to found 12 populations.
Over 44,000 generations were tracked, and only minor mutations appeared, as is typical in these kinds of studies. The usual beneficial mutations -- larger cell size, faster growth rates, and lower peak population densities -- were observed.
Then at generation 31,500 something shocking happened. The bacteria evolved, gaining an entirely new gene that could process citrate, a nutrient that the bacteria could not previously use. To put this in context, lack of citrate metabolism is one of E. coli's identifying traits. And the newly evolved bacteria proceeded to dominate over their citrate-intolerant kin.
Says researcher Richard Lenski, "It's the most profound change we have seen during the experiment. This was clearly something quite different for them, and it's outside what was normally considered the bounds of E. coli as a species, which makes it especially interesting."
Lenski says the only two explanations are either an extremely improbable mutation, such as a rare chromosomal inversion, or a series of small mutations adding up to a useful new gene. Was the trait inevitable, guided by some all-powerful hand? Lenski turned to his freezer for the answer. Thawing bacteria from early generations and regrowing them, he found that pure chance had guided the evolutionary leap -- the early-generation bacteria did not re-evolve the trait. He did find that populations from after generation 20,000 did eventually evolve the trait, indicating that something happened around that time which laid the groundwork for the evolution.
He and his fellow researchers are currently studying exactly what change allowed for the eventual evolution. The experiment also shows that evolution does not always lead to the best possible outcome (in that the other lines did not achieve the same beneficial trait). This has been a major point of contention raised by creationists, who point to structures in nature that serve ornamental or little purpose as proof of creationism.
Further, it goes to show that profound changes can happen, including the introduction of entirely new genes. A particularly harsh criticism leveled in the past by evolution's detractors was that profound genetic changes, including the creation of new genes, had never been observed. Considering that a few genes can account for profound morphological differences in larger organisms, this is a very salient piece of evidence for evolution's supporters.
Jerry Coyne, an evolutionary biologist at the University of Chicago, lauded the research and took the opportunity to poke a bit of fun at creationists, saying, "The thing I like most is it says you can get these complex traits evolving by a combination of unlikely events. That's just what creationists say can't happen."
The findings are reported in the journal PNAS.
FCC to Hold Early Termination Fee Hearing Today
U.S. Customers Could Look Forward To Nicer Fees
The FCC will hold an open meeting today (PDF) on the topic of Early Termination Fees, bringing in a variety of panelists to discuss the future of a thorny practice that, while rankling consumers, is claimed as necessary by the cellular service industry.
Central to the discussion will be an industry-sponsored proposal that seeks to make massive changes in the way providers handle termination fees: under the proposed changes, customers would receive a 30-day grace period to cancel a contract after they sign it, and in the event of a terminated contract after that time, the applicable fees would be prorated down based on the contract's time remaining.
Traditionally, cell phone companies charge the same termination fee regardless of where a customer is in their contract -- the fee is the same whether they are 60 days in or have only 60 days left. This policy, combined with a steady rise in the fees themselves, recently resulted in a phalanx of class-action lawsuits against the industry as consumers grow bitter over what they perceive to be carrier lock-in. Providers say the fees are necessary in order to subsidize customers' phones, which are frequently sold far below cost in order to make service plans more appealing.
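To see what proration would mean in practice, here is a hypothetical comparison -- the $175 fee and 24-month term are illustrative stand-ins, not figures from the industry proposal.

```python
# Hypothetical comparison of a flat vs. prorated early termination fee.
# The $175 fee and 24-month term are illustrative, not from the proposal.

FLAT_FEE = 175.0
CONTRACT_MONTHS = 24

def prorated_fee(months_remaining: int) -> float:
    """Fee scaled down by the fraction of the contract still remaining."""
    return FLAT_FEE * months_remaining / CONTRACT_MONTHS

for months_left in (22, 12, 2):
    print(f"{months_left:2d} months left: flat ${FLAT_FEE:.0f} vs. "
          f"prorated ${prorated_fee(months_left):.2f}")
```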
The new iPhone 3G, with its $199/$299 price point in the United States, will be one of the first phones offered under these new rules. While AT&T customers are still required to sign a two-year agreement to buy the phone, if a customer chooses to terminate his or her contract they will only pay a prorated fee calculated from the time remaining on their contract. Purchasers of the original iPhone will remain bound to the old rules.
Meanwhile, a series of e-mail messages recently revealed by the Associated Press showed that some, if not all, providers in the cell phone industry exempt the government from termination charges.
"The government will never, never accept [a termination penalty] and for the most part I think a lot of the [complaining] is real," wrote Nextel (now Sprint Nextel) former marketing vice president Scott Weiner, in an "confidential" e-mail dated January 2004. It regarded a question of whether or not to assess termination fees for government subscribers that canceled their contract.
As it exists currently, cellular service regulation is handled in a "patchwork" fashion at the state level; industry representatives want the FCC to establish a national regulatory framework instead.
Appearing at the hearing are representatives from the trade group CTIA, DIRECTV, and Verizon, as well as a variety of professors, lawyers, and ordinary consumers.
Microsoft Exec: No Plans for iPhone Clone
Microsoft will keep focus on Windows Mobile and not worry about creating an iPhone clone
Microsoft has mobile operating systems designed for mobile phones and the Zune MP3 player, but does not have any plans to roll both services into a Microsoft-branded phone any time soon.
During an interview with the San Francisco Chronicle, Robbie Bach, Entertainment and Devices division president, said the company has no concrete plans to release an Apple iPhone clone in the future.
"We don't make phones ourselves. We don't have any plans to make phones ourselves. Our focus is on the belief that a phone is a very personal thing. Different people want different types of phones," Bach said during the interview.
Microsoft will instead focus on Windows Mobile, which has shipped on 20 million mobile devices already. When asked about Steve Jobs' announcement of the new 3G iPhone, Bach said Microsoft already ships "lots" of 3G phones that are powered by the Windows Mobile operating system.
Furthermore, Microsoft expects cell phones shipping with Windows Mobile to outsell the iPhone and RIM's BlackBerry smart phones, which run OS X and RIM's proprietary OS, respectively.
The demand for more interactive software is expected to increase as the cell phone market, most notably smart phones, continues to grow.
"In general, what you are seeing in that phone space is tremendous growth in what people have called smart phones. (It's) growing dramatically. That means more opportunity for us. There is more opportunity for services on top of those phones. There is more opportunity for a richer experience. It's not just about the phone. It's about browsing. It's about music. It's about video. It's about e-mails, text messaging and photos."
Keeping the Zune simply a multimedia device may seem risky since many music listeners internationally listen to music on their cell phones, but Microsoft believes the MP3 player market is still big enough to justify making standalone MP3 players. Software is becoming increasingly important for owners of MP3 players; Zune owners can already play TV episodes, and Microsoft may add movies to its portfolio in the future.
In the rest of the interview, Bach discusses Blu-ray and the Nintendo Wii's international dominance and popularity.
Microsoft reached the 10 million consoles sold mark before Nintendo or Sony, but the Wii is quickly catching up and is expected to overtake the Xbox 360 in overall sales shortly. Microsoft maintains, however, that even though the consoles are competing in the same market, they appeal to largely different audiences.
Toshiba Boosts 1.8", 5400 RPM HDDs to 160GB
Toshiba takes the fight to SSDs with an expanded lineup of 5400RPM 1.8" HDDs
Back in late February, DailyTech reported on Toshiba's introduction of 5400 RPM 1.8" HDDs into the marketplace. Traditionally, 1.8" HDDs were only available with a 4200 RPM spindle speed, but the boost to 5400 RPM was a welcome addition to help improve performance on the smallest notebooks and UMPCs.
At the time of the announcement, Toshiba announced the availability of 80GB and 120GB models. Today, Toshiba's 5400 RPM 1.8" lineup is expanding to include a 160GB model and a revamped 80GB model.
The new 160GB (MK1617GSG) drive uses two platters, while its new 80GB (MK8017GSG) counterpart uses a single platter -- Toshiba's 80GB offering launched in February required two platters to reach the same capacity. Both drives feature 8MB of cache, a 15ms average seek time, a micro-SATA connector, and compliance with the SATA 2.6 specification.
"Toshiba's eight years in perfecting 1.8-inch HDD technology puts us in a unique position to address explosive growth in the mobility segment with proven products that deliver the performance and capacity that system manufacturers need," said Toshiba Storage Device Division Marketing VP, Maciek Brzeski.
"At these capacities, our 1.8-inch HDDs are enabling the miniaturization of mobile PCs by providing better power consumption efficiency and improved ruggedness over larger form factors."
Toshiba’s new mobile HDDs will be available to OEMs in August of this year.
Toshiba's recent development in the area of 1.8" HDDs should give it more ammunition to go up against the increasing performance and falling costs of solid state drives (SSDs). Super Talent is currently leading all players in the SSD field by further driving down costs and recently introduced 30GB, 60GB, and 120GB 1.8" SSDs priced at $299, $449, and $679 respectively.
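For a rough comparison, the quoted prices work out to the following cost per gigabyte (a quick calculation from the figures above, not vendor-published numbers); the per-gigabyte price drops sharply at higher capacities.

```python
# Price per gigabyte of Super Talent's quoted 1.8" SSD lineup (from the figures above).

ssd_prices = {30: 299, 60: 449, 120: 679}  # capacity in GB -> price in USD

for capacity_gb, price in ssd_prices.items():
    print(f"{capacity_gb:3d}GB: ${price / capacity_gb:.2f} per GB")
```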