Tuesday, September 25, 2007

Free TV on the web!

Watch free streaming TV signals from all over the world using wwiTV.com. wwiTV.com is an index to streaming media available on the web.

http://wwitv.com/portal.htm

The World According to Ballmer

I just read this BusinessWeek article. It seems Steve Ballmer was at Stanford yesterday, talking about Google among other things. It is funny to me that this was probably around the same time Google was holding its first annual shareholder meeting (check out the webcast).

Steve is an awesome individual. I met him at an Association of Internet Professionals get-together years ago in Chicago. He is hard to describe, but he is very passionate, to say the least, about Microsoft and technology.

Personalize Google

Here is a new Beta from Google, a personalized start page that even includes a preview of your Gmail.

http://www.google.com/ig

Here is something from the Google Blog at:
http://googleblog.blogspot.com/2005/05/method-to-our-madness.html

A method to our madness
5/19/2005 02:58:00 PM

Posted by Marissa Mayer, Director of Consumer Web Products, and Jessica Ewing, Product Manager

Does Google have a strategy, or are we just a bunch of mad computer scientists running around building whatever we want? Today this question gets an answer: we've launched our personalized homepage via Google Labs. It's part of a strategic initiative we refer to as 'fusion' to bring together Google functionality, and content from across the web, in useful ways.

The personalized homepage is a complement to the existing Google homepage - not a replacement. Keep using the original Google homepage if you want to. (We expect many people will.) But if you're keen to organize and customize your information, take a stab at designing your own homepage. You can add Gmail, news, stocks, weather and more. Plus you can add great content from websites like the BBC and Wired. We're incorporating feeds from just a few other sites today, but we envision being able to accept any standardized feed very soon.

Word of the Day: Hyperbolic

adj.
Of, relating to, or employing hyperbole.
Mathematics.
Of, relating to, or having the form of a hyperbola.
Of or relating to a geometric system in which two or more lines can be drawn through any point in a plane and not intersect a given line in the plane.
Of or relating to a hyperbolic function: hyperbolic cosine.

The link above now includes pronunciation, for those of us challenged by that, like me.

Here it is used in a blog:

The Complete Mac-Compatible New Yorker

The New Yorker Store is offering "every page of every issue of America's leading magazine" on fully searchable DVDs. The Complete New Yorker will include every issue of the New Yorker from February 1925 to February 2005, "providing a detailed yet panoramic history of the life of the city, the nation, and the world during the most exciting and astounding decades any society has ever known." [emphasis mine]

Wow. That last bit is a little hyperbolic, isn't it? I kind of think the time of the Visigoths may have us beat, but that's just me. In any case, the New Yorker is full of such hyperbolic hubris, has been for years, and we all love it. I especially dig the cartoons. The really good news is that The Complete New Yorker will only set you back $100, is fully searchable, and is touted as being fully compatible with Mac OS X 10.3 and higher.

posted by Michael at 10:31 AM


--------------------------------------------------------------------------------
About: Listings of technology-related information found on the internet that is either really useful or a complete waste of time...you be the judge.

About Me
Name: Michael Bazzoni
Location: Lake Zurich, Illinois, United States


Have a Wireless Router? Microsoft will find it...

I just read this on MSNBC:

Microsoft tracks WiFi for new mapping system

In a new initiative, Microsoft has dispatched cars to trawl many city and suburban streets across the U.S. to locate the signals sent out by millions of short-range home and office wireless (or WiFi) networks.

Be on the lookout for Microsoft cars in your neighborhood.

Blogger disses Apple and its users

Blogger announced today that they have a plug-in for Microsoft Word so that you can blog using Word and save to your Blog or to your PC. The only problem is that you must have Windows 2000 or Windows XP and Word 2000 or above...no Mac version.

This seems strange to me, as there have been all kinds of rumors about Google and Apple doing some sort of co-brand for the iTunes music store.

I wonder if Google is having internal issues with communication. It's either that or they are riding the fence on the Apple vs. Microsoft vs. Linux debate and figure they need to play nice with everyone for now.

UPDATE: Seems that Blogger wanted to do an Apple version but couldn't find a developer...from TUAW.com

Google free WiFi no longer a rumor

Well, it's here! Free WiFi provided by Google!

Here is a good writeup on it: http://blog.searchenginewatch.com/blog/050920-071042

Here is a link to the test program: http://wifi.google.com/download.html

Here is a link to the FAQ: http://wifi.google.com/faq.html

So far it is a Beta only in San Francisco, but soon it may be in a town near you. If anyone has some actual screenshots of them using this service, I will post links to them here.

Some people said this would never happen, much like they now say the Google OS will never happen. We will have to wait and see...

Microsoft announces something Live

Here are a few new BETA programs just announced by Bill G moments ago...

Microsoft Windows Live - http://ideas.live.com/

From Microsoft:
What is Windows Live?
Your online world gets better when everything works simply and effortlessly together. That's the basic idea behind Windows Live. So the things you care about - your friends, the latest information, your e-mails, powerful search, your PC files - everything comes together in one place. This is a brand new Internet experience designed to put you in control. And this is just the beginning: you'll see many more new services in the coming months.

Microsoft Office Live is coming. http://www.microsoft.com/office/officelive/default.mspx

From Microsoft:
Today, an online presence is almost a requirement for small business success. That's why Microsoft is introducing Microsoft Office Live, a set of affordable business productivity services designed to help you grow your business more easily by establishing a professional presence online.

Microsoft Office Live will provide your company with its own domain name, Web site, and e-mail accounts for free.

Apple removing the boot from OS?

From the Motley Fool: I think Apple's (Nasdaq: AAPL) going to be offering instant-on computers in the very near future.

I remember going to the Office 2000 release party in Chicago. Steve Ballmer was there and they gave the crowd a preview of Windows XP. In the demo they said (and showed) that the Start button would appear faster on the screen, in essence, a shorter boot time. This got them a standing ovation.

Since then we have learned that even though the Start button does appear faster in XP than in previous OS releases, it is still not a faster boot. Services and devices are still being brought online in the background, and some systems need to wait several minutes before the boot process has completed and they can actually be used.

It would really be great to see a system and an OS boot instantaneously. That would be a huge step into the future. With memory prices dropping and technology advancing it would be great to see this sooner than later.

That being said, this is not just needed in the computer arena. When I turn on my cell phone, it has a boot time. When I turn on my Palm, it has a boot time. We need to see memory used more to preload the services needed to operate all kinds of devices. Once we have that, who knows where we can go...

What will Apple do on Tuesday?

Well, Microsoft, Google, etc. made it through CES and did their presentations; now it is Apple's turn. I am on the edge of my seat for this week's announcements. I wish I could be there.

I thought now would be a good time for me to make my own predictions of what Steve Jobs may unveil this week.

DISCLAIMER: These are my guesses as I sit here reading all kinds of rumor Blogs and with no knowledge of what really will happen, so please...no wagering...I just want to see if I am even close on one of these...

My Guesses:

iLife '06 - updated favorites, plus some new things: a basic video production tool for podcasts, iTunes version 5 with the ability to back up video, Front Row for every system, and iWeb included to set up your own blog or podcast space (for .Mac users).

iPod - new 4GB and 6GB nano, now with video. New 80GB or 100GB iPod with a bigger screen for better video.

Airport Express - now supports video and/or TV hookup for use with Front Row.

.Mac - now supports automated backup of ITMS audio and video files. Blog and Podcast support added.

Intel inside - low-priced widescreen laptop starting at $599, and the Mac mini starting at $399.

Long shot predictions:

New device, using Intel processor, that replaces your DVR (maybe a Tivo dual deal since they were quiet at CES) that also has a wireless feature to communicate to things like iTunes.

Windows XP - all systems will have the option to run Windows XP OS and software inside OS X without a dual boot needed.

I will post a recap of how I did on Tuesday.

Windows XP on a Mac, now available from Apple!

Wow, I thought this would happen, but I am shocked it happened today.

Apple has just opened up the Boot Camp Public Beta. This allows Intel-based Mac systems to dual-boot OS X and Windows XP.

Here is something from the site:

Boot Camp lets you install Windows XP without moving your Mac data, though you will need to bring your own copy to the table, as Apple Computer does not sell or support Microsoft Windows. Boot Camp will burn a CD of all the required drivers for Windows so you don't have to scrounge around the Internet looking for them.

Now I need one of those Intel Macs...

Google Releases Spreadsheets Beta

I got into the Google Spreadsheets Beta today. So far it is really cool. Much like the Writely Beta. Both work great on a Mac as long as you are using Firefox.

The only concern I have so far, with both Spreadsheets and Writely, is your requirement to be on the Internet to access your data.

Over the weekend we had an almost complete Comcast shutdown in our house. It ended up being a weak signal and some bad wiring (which Comcast fixed quickly). But over the weekend we were sans Internet. I had no way to get my data. Sure, I can save the data offline, but then I need to manage my space myself. Maybe Google will come up with a way to keep your data offline and have it sync. At least that's a suggestion I will be sending on to Google.

For now, if you are a PC user, I would check out the Microsoft Office Beta as well. I have been on it for a week at work and it is really great. I am still putting it through its paces with daily work, so I will share what I find in a later post.

BTW: What's next for Google? Google Presentations? I see an entire office suite being created...

Technorati Tags: Google, Writely, Microsoft, Mac

Monday, September 24, 2007

Moore's Law

Moore's Law
From Wikipedia, the free encyclopedia

Growth of transistor counts for Intel processors (dots) and Moore's Law (upper line = 18 months; lower line = 24 months)

For the observation regarding information retrieval, see Mooers' Law.
Moore's Law describes an important trend in the history of computer hardware: that the number of transistors that can be inexpensively placed on an integrated circuit is increasing exponentially, doubling approximately every two years. The observation was first made by Intel co-founder Gordon E. Moore in a 1965 paper.[1][2] The trend has continued for more than half a century and is not expected to stop for a decade at least and perhaps much longer.[3]
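
The doubling arithmetic behind the law is easy to check. Here is a minimal Java sketch, assuming an idealized, perfectly regular doubling period (real transistor counts are of course noisier):

```java
// A minimal sketch of Moore's Law doubling arithmetic. The doubling
// periods are the idealized figures quoted in the article, not real data.
public class MooresLaw {
    // Growth factor after `years`, given a doubling time in years.
    static double growthFactor(double years, double doublingYears) {
        return Math.pow(2.0, years / doublingYears);
    }

    public static void main(String[] args) {
        // A decade at the 24-month doubling: 2^5 = 32-fold
        System.out.println(growthFactor(10, 2));            // 32.0
        // The same decade at the oft-quoted 18-month doubling: roughly 100-fold
        System.out.println(Math.round(growthFactor(10, 1.5)));
    }
}
```

Note how sensitive the outcome is to the doubling period: 18 months versus 24 months is the difference between a hundredfold and a 32-fold gain per decade.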

Almost every measure of the capabilities of digital electronic devices is linked to Moore's Law: processing speed, memory capacity, even the resolution of LCD screens and digital cameras. All of these are improving at (roughly) exponential rates as well. This has dramatically changed the usefulness of digital electronics in nearly every segment of the world economy.[4] Moore's Law is a driving force of technological and social change in the late 20th and early 21st centuries.

Contents
1 History
2 Formulations of Moore's Law
3 A self-fulfilling prophecy: industry struggles to keep up with Moore's Law
4 Future trends
5 Ultimate limits of the law
6 Futurists and Moore's Law
7 Software: breaking the law
8 Other considerations
9 See also



History

Gordon Moore's original graph from 1965

Moore's original statement can be found in his publication "Cramming more components onto integrated circuits", Electronics Magazine, 19 April 1965:

“ The complexity for minimum component costs has increased at a rate of roughly a factor of two per year ... Certainly over the short term this rate can be expected to continue, if not to increase. Over the longer term, the rate of increase is a bit more uncertain, although there is no reason to believe it will not remain nearly constant for at least 10 years. That means by 1975, the number of components per integrated circuit for minimum cost will be 65,000. I believe that such a large circuit can be built on a single wafer.[1] ”

The term Moore's Law was coined around 1970 by the Caltech professor, VLSI pioneer, and entrepreneur Carver Mead.[5][2] Moore may have heard Douglas Engelbart, a co-inventor of today's mechanical computer mouse, discuss the projected downscaling of integrated circuit size in a 1960 lecture.[6] In April 2005, Intel offered $10,000 to purchase a copy of the original Electronics Magazine.[7] David Clark, an engineer living in the UK, was the first to find a copy and offer it to Intel.[8]

In 1975, Moore projected a doubling only every two years. He is adamant that he himself never said "every 18 months", but that is how it has been quoted.[9] The SEMATECH roadmap follows a 24 month cycle.


Formulations of Moore's Law

PC hard disk capacity (in GB). The plot is logarithmic, so the fit line corresponds to exponential growth.

Several measures of digital technology are improving exponentially.

Transistors per integrated circuit. The most popular formulation is of the doubling of the number of transistors on integrated circuits every two years. At the end of the 1970s, Moore's Law became known as the limit for the number of transistors on the most complex chips. Recent trends show that this rate has been maintained into 2007.

Cost per transistor. An extremely clear formulation.

Density at minimum cost per transistor. This is the formulation given in Moore's 1965 paper.[1] It is not about just the density of transistors that can be achieved, but about the density of transistors at which the cost per transistor is the lowest.[10] As more transistors are put on a chip, the cost to make each transistor decreases, but the chance that the chip will not work due to a defect increases. In 1965, Moore examined the density of transistors at which cost is minimized, and observed that, as transistors were made smaller through advances in photolithography, this number would increase at "a rate of roughly a factor of two per year".[1]
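
The cost-versus-yield tension described above can be illustrated with a toy model. This sketch is not from Moore's paper: the chip cost, defect rate, and simple exponential yield formula below are all made-up assumptions, chosen only to show that cost per working transistor has a minimum at some intermediate density:

```java
// Toy illustration of "density at minimum cost per transistor".
// Packing more transistors on a fixed-cost chip lowers cost per transistor,
// but a simple exponential yield model makes defective chips likelier,
// raising it again. All numbers here are hypothetical.
public class MinCostDensity {
    // Expected cost per working transistor: chip cost spread over the
    // transistors of the chips that actually work.
    static double costPerGoodTransistor(double n, double chipCost,
                                        double defectPerTransistor) {
        double yield = Math.exp(-defectPerTransistor * n); // fraction of good chips
        return chipCost / (n * yield);
    }

    public static void main(String[] args) {
        double chipCost = 10.0, defect = 1e-4;  // hypothetical values
        // Sweep transistor counts; under this model the minimum lands at
        // n = 1/defect = 10,000 transistors.
        for (int n = 2000; n <= 20000; n += 2000) {
            System.out.printf("n=%6d  cost/transistor=%.6f%n",
                    n, costPerGoodTransistor(n, chipCost, defect));
        }
    }
}
```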

Computing power per unit cost. It is also common to cite Moore's Law to refer to the rapidly continuing advance in computing power per unit cost, because increase in transistor count is also a rough measure of computer processing power. On this basis, the power of computers per unit cost - or more colloquially, "bang per buck" - doubles every 24 months (or, equivalently, increases 32-fold every 10 years).

Hard disk storage cost per unit of information. A similar law (sometimes called Kryder's Law) has held for hard disk storage cost per unit of information.[11] The rate of progression in disk storage over the past decades has actually sped up more than once, corresponding to the utilization of error correcting codes, the magnetoresistive effect and the giant magnetoresistive effect. The current rate of increase in hard drive capacity is roughly similar to the rate of increase in transistor count. Recent trends show that this rate has been maintained into 2007.

RAM storage capacity. Another version states that RAM storage capacity increases at the same rate as processing power.

Data per optical fiber. According to Gerry/Gerald Butters,[12][13] the former head of Lucent's Optical Networking Group at Bell Labs, there is another version, called Butter's Law of Photonics,[14] a formulation which deliberately parallels Moore's law. Butter's Law [15] says that the amount of data coming out of an optical fiber is doubling every nine months. Thus, the cost of transmitting a bit over an optical network decreases by half every nine months. The availability of wavelength-division multiplexing (sometimes called "WDM") increased the capacity that could be placed on a single fiber by as much as a factor of 100. Optical networking and DWDM is rapidly bringing down the cost of networking, and further progress seems assured. As a result, the wholesale price of data traffic collapsed in the dot-com bubble.


Pixels per dollar based on Australian recommended retail price of Kodak digital cameras

Pixels per dollar. Similarly, Barry Hendy of Kodak Australia has plotted "pixels per dollar" as a basic measure of value for a digital camera, demonstrating the historical linearity (on a log scale) of this market and the opportunity to predict the future trend of digital camera price and resolution.


A self-fulfilling prophecy: industry struggles to keep up with Moore's Law
Although Moore's Law was initially made in the form of an observation and forecast, the more widely it became accepted, the more it served as a goal for an entire industry. This drove both marketing and engineering departments of semiconductor manufacturers to focus enormous energy aiming for the specified increase in processing power that it was presumed one or more of their competitors would soon actually attain. In this regard, it can be viewed as a self-fulfilling prophecy.

The implications of Moore's Law for computer component suppliers are very significant. A typical major design project (such as an all-new CPU or hard drive) takes between two and five years to reach production-ready status. In consequence, component manufacturers face enormous timescale pressures—just a few weeks of delay in a major project can spell the difference between great success and massive losses, even bankruptcy. Expressed as "a doubling every 18 months", Moore's Law suggests phenomenal progress for technology over the span of a few years. Expressed on a shorter timescale, however, this equates to an average performance improvement in the industry as a whole of close to 1% per week. Thus, for a manufacturer in the competitive CPU market, a new product that is expected to take three years to develop and turns out just three or four months late is 10 to 15% slower, bulkier, or lower in capacity than the directly competing products, and is close to unsellable. If instead we accept that performance will double every 24 months, rather than every 18 months, a 3 to 4 month delay would translate to 8-11% lower performance.
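
The week-by-week arithmetic above can be sketched directly. This assumes a clean exponential with the stated doubling periods, which is a simplification of real competitive dynamics:

```java
// Sketch of the competitive cost of schedule slip under exponential
// industry progress, using the doubling periods quoted in the article.
public class DelayCost {
    // Fractional performance deficit from shipping `weeksLate` behind an
    // industry that doubles every `doublingWeeks`.
    static double deficit(double weeksLate, double doublingWeeks) {
        return 1.0 - Math.pow(2.0, -weeksLate / doublingWeeks);
    }

    public static void main(String[] args) {
        // 18-month doubling is about 78 weeks: ~14 weeks late -> ~12% behind,
        // in line with the 10-15% figure above.
        System.out.printf("%.1f%%%n", 100 * deficit(14, 78));
        // 24-month doubling is about 104 weeks: the same delay -> ~9% behind,
        // in line with the 8-11% figure above.
        System.out.printf("%.1f%%%n", 100 * deficit(14, 104));
    }
}
```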

As the cost of computer power to the consumer falls, the cost for producers to fulfill Moore's Law follows an opposite trend: R&D, manufacturing, and test costs have increased steadily with each new generation of chips. As the cost of semiconductor equipment is expected to continue increasing, manufacturers must sell larger and larger quantities of chips to remain profitable. (The cost to tape-out a chip at 180 nm was roughly US$300,000. The cost to tape-out a chip at 90 nm exceeds US$750,000, and is expected to exceed US$1,000,000 for 65 nm.) In recent years, analysts have observed a decline in the number of "design starts" at advanced process nodes (130 nm and below for 2007). While these observations were made in the period after the 2000 economic downturn, the decline may be evidence that traditional manufacturers in the long-term global market cannot economically sustain Moore's Law.


Future trends
Computer industry technology "road maps" predict (as of 2001) that Moore's Law will continue for several chip generations. Depending on the doubling time used in the calculations, this could mean up to a hundredfold increase in transistor count per chip within a decade. The semiconductor industry technology roadmap uses a three-year doubling time for microprocessors, leading to a tenfold increase in the next decade.[16] Intel was reported in 2005 as stating that the downsizing of silicon chips with good economics can continue during the next decade.[17]

Some of the new directions in research that may allow Moore's law to continue are:

Intel's prediction of increasing use of materials other than silicon was verified in mid-2006, as was its intent of using trigate transistors from around 2009 [citation needed].
Researchers from IBM and Georgia Tech created a new speed record when they ran a silicon/germanium helium-supercooled transistor at 500 gigahertz (GHz).[18] The transistor operated above 500 GHz at 4.5 K (−451 °F)[19] and simulations showed that it could likely run at 1 THz (1,000 GHz), although this was only a single transistor, and practical desktop CPUs running at this speed are extremely unlikely using contemporary silicon chip techniques [citation needed].
In early 2006, IBM researchers announced that they had developed a technique to print circuitry only 29.9 nm wide using deep-ultraviolet (DUV, 193-nanometer) optical lithography. IBM claims that this technique may allow chipmakers to use current methods for seven years while continuing to achieve results forecast by Moore's Law. New methods that can achieve smaller circuits are expected to be substantially more expensive.
On January 27, 2007, Intel demonstrated a working 45 nm chip codenamed "Penryn", intending mass production to begin in late 2007.[20] A decade ago, chips were built using a 500 nm process.
Companies are working on using nanotechnology to solve the complex engineering problems involved in producing chips at the 32 nm and smaller levels. (The diameter of an atom is on the order of 0.1 nm.)
While this time horizon for Moore's Law scaling is possible, it does not come without underlying engineering challenges. One of the major challenges in integrated circuits that use nanoscale transistors is the increase in parameter variation and leakage currents. As a result of variation and leakage, the design margins available for predictive design are shrinking. Such systems also dissipate considerable power even when not switching. Adaptive and statistical design, along with leakage power reduction, is critical to sustaining CMOS scaling. A good treatment of these topics can be found in Leakage in Nanometer CMOS Technologies. Other scaling challenges include:

The ability to control parasitic resistance and capacitance in transistors,
The ability to reduce resistance and capacitance in electrical interconnects,
The ability to maintain proper transistor electrostatics to allow the gate terminal to control the ON/OFF behavior,
Increasing effect of line edge roughness,
Dopant fluctuations,
System level power delivery,
Thermal design to effectively handle the dissipation of delivered power, and
Solving all these challenges at an ever-reducing manufacturing cost of the overall system.

Ultimate limits of the law
On April 13, 2005, Gordon Moore himself stated in an interview that the law cannot be sustained indefinitely ("It can't continue forever. The nature of exponentials is that you push them out and eventually disaster happens.") and noted that transistors would eventually reach the limits of miniaturization at atomic levels:

“ In terms of size [of transistor] you can see that we're approaching the size of atoms which is a fundamental barrier, but it'll be two or three generations before we get that far—but that's as far out as we've ever been able to see. We have another 10 to 20 years before we reach a fundamental limit. By then they'll be able to make bigger chips and have transistor budgets in the billions.[21] ”

Krauss and Starkman announced an ultimate limit of around 600 years in their paper "Universal Limits of Computation", based on rigorous estimation of total information-processing capacity of any system in the Universe.

Then again, the law has often met obstacles that appeared insurmountable, before soon surmounting them. In that sense, Moore says he now sees his law as more beautiful than he had realised: "Moore's Law is a violation of Murphy's Law. Everything gets better and better."[22]


Futurists and Moore's Law

Kurzweil expansion of Moore's Law from integrated circuits to earlier transistors, vacuum tubes, relays, and electromechanical computers.

Extrapolation partly based on Moore's Law has led futurists such as Vernor Vinge, Bruce Sterling, and Ray Kurzweil to speculate about a technological singularity. Kurzweil projects that a continuation of Moore's Law until 2019 will result in transistor features just a few atoms in width. Although this means that the strategy of ever finer photolithography will have run its course, he speculates that this does not mean the end of Moore's Law:

“ Moore's Law of Integrated Circuits was not the first, but the fifth paradigm to forecast accelerating price-performance ratios. Computing devices have been consistently multiplying in power (per unit of time) from the mechanical calculating devices used in the 1890 U.S. Census, to [Newman's] relay-based "Robinson" machine that cracked the [German Lorenz cipher], to the CBS vacuum tube computer that predicted the election of Eisenhower, to the transistor-based machines used in the first space launches, to the integrated-circuit-based personal computer.[23] ”

Thus, Kurzweil conjectures that it is likely that some new type of technology will replace current integrated-circuit technology, and that Moore's Law will hold true long after 2020. He believes that the exponential growth of Moore's Law will continue beyond the use of integrated circuits into technologies that will lead to the technological singularity. The Law of Accelerating Returns described by Ray Kurzweil has in many ways altered the public's perception of Moore's Law. It is a common (but mistaken) belief that Moore's Law makes predictions regarding all forms of technology, when it actually only concerns semiconductor circuits. Many futurists still use the term "Moore's Law" in this broader sense to describe ideas like those put forth by Kurzweil.


Software: breaking the law
A sometimes misunderstood point is that exponentially improved hardware does not necessarily imply exponentially improved software performance to go with it. The productivity of software developers most assuredly does not increase exponentially with the improvement in hardware, but by most measures has increased only slowly and fitfully over the decades. Software tends to get larger and more complicated over time, and Wirth's law even states humorously that "Software gets slower faster than hardware gets faster".

There are problems where exponential increases in processing power are matched or exceeded by exponential increases in complexity as the problem size increases. (See computational complexity theory and complexity classes P and NP for a (somewhat theoretical) discussion of such problems, which occur very commonly in applications such as scheduling.)

Due to the mathematical power of exponential growth (similar to the financial power of compound interest), seemingly minor fluctuations in the relative growth rates of CPU performance, RAM capacity, and disk space per dollar have caused the relative costs of these three fundamental computing resources to shift markedly over the years, which in turn has caused significant changes in programming styles. For many programming problems, the developer has to decide on numerous time-space tradeoffs, and throughout the history of computing these choices have been strongly influenced by the shifting relative costs of CPU cycles versus storage space.


Other considerations
Not all aspects of computing technology develop in capacities and speed according to Moore's Law. Random Access Memory (RAM) speeds and hard drive seek times improve at best a few percentage points each year. Since the capacity of RAM and hard drives is increasing much faster than is their access speed, intelligent use of their capacity becomes more and more important. It now makes sense in many cases to trade space for time, such as by precomputing indexes and storing them in ways that facilitate rapid access, at the cost of using more disk and memory space: space is getting cheaper relative to time.
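
The space-for-time trade just described can be sketched in a few lines: spend memory on a precomputed index so lookups avoid a linear scan. The record set and key names below are invented for illustration:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of trading space for time: build an index once (extra memory),
// then answer each lookup in constant time instead of scanning.
public class SpaceForTime {
    // Precompute a key -> position index over all records (costs O(n) space).
    static Map<String, Integer> buildIndex(String[] records) {
        Map<String, Integer> index = new HashMap<>();
        for (int i = 0; i < records.length; i++) index.put(records[i], i);
        return index;
    }

    // The space-frugal alternative: scan every record on each lookup.
    static int linearScan(String[] records, String key) {
        for (int i = 0; i < records.length; i++)
            if (records[i].equals(key)) return i;
        return -1;
    }

    public static void main(String[] args) {
        String[] records = {"alpha", "beta", "gamma"};
        Map<String, Integer> index = buildIndex(records); // extra space...
        System.out.println(index.get("gamma"));           // ...constant-time lookup
        System.out.println(linearScan(records, "gamma")); // same answer, O(n) time
    }
}
```

As RAM and disk keep getting cheaper relative to access time, the indexed version wins in more and more situations.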

Moreover, there is a popular misconception that the clock speed of a processor determines its speed, also known as the Megahertz Myth. This actually also depends on the number of instructions per tick which can be executed (as well as the complexity of each instruction, see MIPS, RISC and CISC), and so the clock speed can only be used for comparison between two identical circuits. Of course, other factors must be taken into consideration such as the bus width and speed of the peripherals. Therefore, most popular evaluations of "computer speed" are inherently biased, without an understanding of the underlying technology. This was especially true during the Pentium era when popular manufacturers played with public perceptions of speed, focusing on advertising the clock rate of new products.[24]
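
Toy numbers make the point: clock rate only predicts speed when instructions per cycle are held fixed. The figures below are invented for illustration:

```java
// Sketch of the "Megahertz Myth": throughput depends on both clock rate
// and instructions completed per cycle. All figures here are hypothetical.
public class MegahertzMyth {
    // Millions of instructions per second for a given clock and IPC.
    static double mips(double clockMHz, double instructionsPerCycle) {
        return clockMHz * instructionsPerCycle;
    }

    public static void main(String[] args) {
        // A hypothetical 3000 MHz chip retiring 1 instruction per cycle...
        System.out.println(mips(3000, 1.0));
        // ...is no faster than a 1500 MHz chip retiring 2 per cycle.
        System.out.println(mips(1500, 2.0));
    }
}
```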

It is also important to note that transistor density in multi-core CPUs does not necessarily reflect a similar increase in practical computing power, due to the unparallelised nature of most applications.


See also
Amdahl's law
Bell's Law
Experience curve effects
Exponential growth
Gates' Law
History of computing hardware (1960s-present)
Hofstadter's Law
Kryder's Law
Logistic growth
Observations named after people
Quantum Computing
Rock's Law
Second Half of the Chessboard
Semiconductor
Wirth's Law "Software gets slower faster than hardware gets faster."

Sunday, September 23, 2007

How Java Works

Have you ever wondered how computer programs work? Have you ever wanted to learn how to write your own computer programs? Whether you are 14 years old and hoping to learn how to write your first game, or you are 70 years old and have been curious about computer programming for 20 years, this article is for you. In this edition of HowStuffWorks, I'm going to teach you how computer programs work by teaching you how to program in the Java programming language.

In order to teach you about computer programming, I am going to make several assumptions from the start:

I am going to assume that you know nothing about computer programming now. If you already know something then the first part of this article will seem elementary to you. Please feel free to skip forward until you get to something you don't know.

I am going to assume you do know something about the computer you are using. That is, I am going to assume you already know how to edit a file, copy and delete files, rename files, find information on your system, etc.

For simplicity, I am going to assume that you are using a machine running Windows 95, 98, 2000, NT or XP. If you run another operating system, it should be relatively straightforward to map the concepts over to your own system.

I am going to assume that you have a desire to learn.
All of the tools you need to start programming in Java are widely available on the Web for free. There is also a huge amount of educational material for Java available on the Web, so once you finish this article you can easily go learn more to advance your skills. You can learn Java programming here without spending any money on compilers, development environments, reading materials, etc. Once you learn Java it is easy to learn other languages, so this is a good place to start.

Having said these things, we are ready to go. Let's get started!
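The remaining pages of the article walk through writing, compiling and running a first program. The canonical first Java program looks like the sketch below (the class and the file must both be named HelloWorld; the separate message() method is my addition so the text is easy to reuse):

```java
// HelloWorld.java -- compile with "javac HelloWorld.java",
// then run with "java HelloWorld".
public class HelloWorld {

    // The greeting lives in its own method so it can be reused.
    static String message() {
        return "Hello, World!";
    }

    // Execution starts at main(), much as in C and C++.
    public static void main(String[] args) {
        System.out.println(message());
    }
}
```

Running it prints Hello, World! to the console; nearly everything that follows in a Java tutorial is a variation on this compile-then-run cycle.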



NEXT Inside This Article
1. Introduction to How Java Works
2. A Little Terminology
3. Downloading the Java Compiler
4. Your First Program
5. Understanding What Just Happened
6. Bugs and Debugging
7. Variables
8. Looping
9. Lots More Information
10. See all Software articles

Does adding more RAM to your computer make it faster?

Up to a point, adding RAM (random access memory) will normally cause your computer to feel faster on certain types of operations. RAM is important because of an operating system component called the virtual memory manager (VMM).

When you run a program such as a word processor or an Internet browser, the microprocessor in your computer pulls the executable file off the hard disk and loads it into RAM. In the case of a big program like Microsoft Word or Excel, the EXE consumes about 5 megabytes. The microprocessor also pulls in a number of shared DLLs (dynamic link libraries) -- shared pieces of code used by multiple applications. The DLLs might total 20 or 30 megabytes. Then the microprocessor loads in the data files you want to look at, which might total several megabytes if you are looking at several documents or browsing a page with a lot of graphics. So a normal application needs between 10 and 30 megabytes of RAM space to run. On my machine, at any given time I might have the following applications running:

A word processor
A spreadsheet
A DOS prompt
An e-mail program
A drawing program
Three or four browser windows
A fax program
A Telnet session
Besides all of those applications, the operating system itself is taking up a good bit of space. Those programs together might need 100 to 150 megabytes of RAM, but my computer only has 64 megabytes of RAM installed.
The extra space is created by the virtual memory manager. The VMM looks at RAM and finds sections of RAM that are not currently needed. It puts these sections of RAM in a place called the swap file on the hard disk. For example, even though I have my e-mail program open, I haven't looked at e-mail in the last 45 minutes. So the VMM moves all of the bytes making up the e-mail program's EXE, DLLs and data out to the hard disk. That is called swapping out the program. The next time I click on the e-mail program, the VMM will swap in all of its bytes from the hard disk, and probably swap something else out in the process. Because the hard disk is slow relative to RAM, the act of swapping things in and out causes a noticeable delay.

If you have a very small amount of RAM (say, 16 megabytes), then the VMM is always swapping things in and out to get anything done. In that case, your computer feels like it is crawling. As you add more RAM, you get to a point where you only notice the swapping when you load a new program or change windows. If you were to put 256 megabytes of RAM in your computer, the VMM would have plenty of room and you would never see it swapping anything. That is as fast as things get. If you then added more RAM, it would have no effect.

Some applications (things like Photoshop, many compilers, most film editing and animation packages) need tons of RAM to do their job. If you run them on a machine with too little RAM, they swap constantly and run very slowly. You can get a huge speed boost by adding enough RAM to eliminate the swapping. Programs like these may run 10 to 50 times faster once they have enough RAM!
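The swap-in/swap-out behaviour described above is essentially a least-recently-used eviction policy. Here is a toy model of it in Java (the program names and sizes are made up, and a real VMM pages memory in small blocks rather than swapping whole programs, so treat this strictly as an illustration of the policy):

```java
import java.util.Iterator;
import java.util.LinkedHashMap;
import java.util.Map;

// Toy model of VMM swapping: RAM holds whole programs, and the
// least-recently-used program is swapped to disk when space runs out.
public class VmmSketch {
    private final int capacityMb;
    private int usedMb = 0;
    private int swapOuts = 0;
    // accessOrder=true makes iteration order run from least to most
    // recently used, which is exactly the eviction order we want.
    private final LinkedHashMap<String, Integer> resident =
            new LinkedHashMap<>(16, 0.75f, true);

    VmmSketch(int capacityMb) { this.capacityMb = capacityMb; }

    // A program touches memory; swap LRU programs out until it fits.
    void touch(String program, int sizeMb) {
        if (resident.containsKey(program)) {
            resident.get(program);  // refreshes its recently-used status
            return;
        }
        while (usedMb + sizeMb > capacityMb && !resident.isEmpty()) {
            Iterator<Map.Entry<String, Integer>> lru = resident.entrySet().iterator();
            Map.Entry<String, Integer> victim = lru.next();  // least recently used
            usedMb -= victim.getValue();
            swapOuts++;                                      // slow disk write here
            lru.remove();
        }
        resident.put(program, sizeMb);
        usedMb += sizeMb;
    }

    int swapOuts() { return swapOuts; }

    public static void main(String[] args) {
        VmmSketch ram = new VmmSketch(64);   // 64 MB of RAM, as in the article
        ram.touch("word processor", 25);
        ram.touch("e-mail", 20);
        ram.touch("browser", 30);            // forces the word processor out
        ram.touch("word processor", 25);     // e-mail gets swapped out in turn
        System.out.println("swap-outs: " + ram.swapOuts());
    }
}
```

With 64 MB of modelled RAM, the fourth touch already triggers a second swap-out; give the sketch 128 MB instead and the count drops to zero, which is the "add RAM until swapping stops" effect the article describes.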

Here are some interesting links:

How RAM Works
How Virtual Memory Works

Check your virtual memory settings
About the Virtual Memory Manager

Wednesday, September 19, 2007

Biting on aluminum foil can be painful. Why?

Biting on aluminum foil can be painful and is usually noticed if you have metal in your mouth from dental work (e.g. fillings, crowns). Basically, when you bite on foil, you set up a battery in your mouth and the electrical current stimulates nerve endings in your tooth. Here is what happens:
1. Pressure from biting brings two dissimilar metals (aluminum foil and the mercury in fillings or gold in crowns) into contact in a moist, salty environment (saliva).
2. The two metals have an electrochemical potential difference, or voltage, across them.
3. Electrons flow from the foil into the tooth (an electric current).
4. The current is conducted into the tooth's root, usually by the filling or crown.
5. The current sets off a nerve impulse in the root's nerve.
6. The nerve impulse is sent to the brain.
7. The brain interprets the impulse as pain.
The production of electric current between two metals in contact is called the voltaic effect after Alessandro Volta, who discovered it. Early batteries were made by stacking metal discs together in a pile called a voltaic pile.
If you have no metal dental work in your mouth, you should not feel this effect.
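The size of that potential difference can be estimated from textbook standard reduction potentials. A rough sketch (the numbers are standard-table values versus the hydrogen electrode; a real filling is an alloy and saliva is far from standard 1 M conditions, so the actual voltage in the mouth is considerably smaller):

```java
import java.util.Map;

// Rough estimate of the "mouth battery" voltage from standard
// reduction potentials (volts vs. the standard hydrogen electrode).
// Textbook half-cell values; the in-mouth voltage is lower in practice.
public class MouthBattery {
    static final Map<String, Double> E0 = Map.of(
            "Al3+/Al", -1.66,   // aluminum foil (anode)
            "Au3+/Au",  1.50);  // gold crown (cathode)

    // Cell potential = cathode potential minus anode potential.
    static double cellPotential(String cathode, String anode) {
        return E0.get(cathode) - E0.get(anode);
    }

    public static void main(String[] args) {
        System.out.printf("~%.2f V between foil and a gold crown%n",
                cellPotential("Au3+/Au", "Al3+/Al"));
    }
}
```

Even a fraction of the roughly three volts this predicts is ample to trigger the nerve in a tooth's root.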

VISITING NASA

NASA Visitors Center and Tours


The general public is welcome to visit and tour many NASA installations. Some NASA Centers operate their own visitors centers, and others have contractual arrangements with private firms. Admission fees are charged at some sites. To check on hours, admission and tour availability, visit these Web pages:

NASA Headquarters, 300 E St. SW, Washington, D.C.
NASA Headquarters does not have its own visitors' center, but it is located four blocks south of the Smithsonian Institution's National Air & Space Museum, which houses an extensive collection of aeronautics and space program artifacts. The NASA Headquarters Information Center offers some information brochures, mission decals and posters for free and sells various NASA publications and directories by phone or mail.

If you're interested in exploring more space related sites in the Washington DC area check out our Interactive Map.

Ames Research Center
Moffett Field, Calif. 94035-1000 (South San Francisco Bay Area). Phone: (650) 604-6274
+ View Site

Dryden Flight Research Center
On Edwards Air Force Base (approximately 2 hours north of Los Angeles, California.) Bldg 4839, 4800 Lilly Drive, Edwards, Calif. 93523-0273, Phone: (661) 276-3449
+ View Site

Glenn Research Center
at Lewis Field, 21000 Brookpark Road, Cleveland, Ohio 44135-3191, Phone: (216) 433-2001
+ View Site

Goddard Space Flight Center
Greenbelt, Md. (15 miles east of Washington, D.C.) Greenbelt Road, Greenbelt, Md. 20771-0001, Phone: (301) 286-8981
+ View Site

Jet Propulsion Laboratory
Pasadena, Calif. (near Los Angeles). 4800 Oak Grove Drive, Pasadena, CA 91109-8099, Phone: (818) 354-9314
+ View Site

Johnson Space Center
Houston. The privately operated Space Center Houston. Admission fee. Space Center Houston, Johnson Space Center, Houston, TX 77058, Phone: (281) 244-2105
+ View Site

Kennedy Space Center
Cape Canaveral, Florida, east of Orlando. Spaceport U.S.A., Kennedy Space Center, Kennedy Space Center, FL 32899-0001, Phone: (321) 452-2121
+ View Site

Seeing a Space Shuttle launch
+ View Site

Langley Research Center
Hampton, Virginia. Visitors center located at the Virginia Air & Space Center in downtown Hampton.
+ View Site

Marshall Space Flight Center
Huntsville, Alabama. The privately operated U.S. Space & Rocket Center. Admission fee. One Tranquility Base, Huntsville, AL 35805, Phone: (256) 837-3400
+ View Site

Stennis Space Center
Bay St. Louis, Mississippi. Free admission. Visitors Center, Stennis Space Center, MS 39529-6000, Phone: (228) 688-2370
+ View Site

Wallops Flight Facility
Wallops Island, Virginia (on Virginia's Eastern Shore). Free admission. Bldg. J-17, Wallops Island, VA 23337, Phone: (757) 824-2298
+ View Site

Note: The privately run commercial Space Camps are not owned or operated by NASA.

ABOUT NASA

Mars Rovers Survive Dust Storms, Ready for Next Objectives

Updated Sept. 11, 2007 -- NASA's Opportunity rover completed activities on Sept. 8 and 9 as planned, so controllers sent commands instructing the rover to enter Victoria Crater today. The commands call for Opportunity to descend about 10 feet into the crater, getting all six wheels inside the rim, and then to back out and assess how much its wheels slipped on the inner slope.

PASADENA, Calif. - Two months after sky-darkening dust from severe storms nearly killed NASA's Mars exploration rovers, the solar powered robots are awake and ready to continue their mission.

Opportunity's planned descent into the giant Victoria Crater was delayed, but now the rover is preparing to drive into the 800-meter-diameter crater (half-mile-diameter) as early as Sept. 11.

Image right: Opportunity had this view from the rim of Mars' Victoria Crater about 130 feet from where controllers intend to start the rover's descent inside the crater. Image credit: NASA/JPL-Caltech
+ Full image and caption
+ Audio clips for media
+ Play Podcast

Spirit, Opportunity's rover twin, also survived the global dust storms. The rovers are 43 months into missions originally planned to last three months. On Sept. 5, Spirit climbed onto its long-term destination called Home Plate, a plateau of layered bedrock bearing clues to an explosive mixture of lava and water.

"These rovers are tough. They faced dusty winds, power starvation and other challenges -- and survived. Now they are back to doing groundbreaking field work on Mars. These spacecraft are amazing," said Alan Stern, associate administrator of NASA's Science Mission Directorate, Washington.

Victoria Crater contains an exposed layer of bright rocks that may preserve evidence of interaction between the Martian atmosphere and surface from millions of years ago, when the atmosphere might have been different from today's. Victoria is the biggest crater Opportunity has visited.

Martian dust storms in July blocked so much sunlight that researchers grew concerned the rovers' daily energy supplies could plunge too low for survival. Engineers at NASA's Jet Propulsion Laboratory, Pasadena, Calif., put Opportunity onto a very low-energy regimen of no movement, few observations and reduced communication with Earth. Skies above both rovers remain dusty but have been clearing gradually since early August.

Dust from the sky has been falling onto both rovers' solar panels, impeding their ability to collect energy from the sun. However, beneficial wind gusts removed some of the new buildup from Opportunity almost as soon as it accumulated.

Opportunity drove to the lip of Victoria Crater in late August and examined possible entry routes. This week, Opportunity has been driving about 40 meters (about 130 feet) toward its planned entry point. The route will provide better access to a top priority target inside the crater: a bright band of rocks about 12 meters (about 40 feet) from the rim. "We chose a point that gives us a straight path down, instead of driving cross-slope from our current location," said Paolo Bellutta, a JPL rover driver plotting the route. "The rock surface on which Opportunity will be driving will provide good traction and control of its path into the crater."

For its first foray into the crater, Opportunity will drive just far enough to get all six wheels in; it will then back out and assess slippage on the inner slope. "Opportunity might be ready for that first 'toe dip' into the crater as early as next week," said JPL's John Callas, rover project manager. "In addition to the drives to get to the entry point, we still need to conduct checkouts of two of Opportunity's instruments before sending the rover into the crater."

The rover team plans to assess if dust has impaired use of the microscopic imager. If that tool is working, the team will use it to observe whether a scanning mirror for the miniature thermal emission spectrometer (Mini-TES) can function accurately. This mirror is high on the rover's camera mast. It reflects infrared light from the landscape to the spectrometer at the base of the mast, and it also can be positioned to close the hole in the mast as protection from dust. The last time the spectrometer was used, some aspects of the data suggested the instrument may have been viewing the inside of the mast instead of the Martian landscape.

"If the dust cover or mirror is no longer moving properly, we may have lost the ability to use that instrument on Opportunity," said Steve Squyres of Cornell University, Ithaca, N.Y., principal investigator for the rovers' science instruments. "It would be the first permanent loss of an instrument on either rover. But we'll see."

The instrument already has provided extensive valuable information about rocks and soils in the Meridiani region where Opportunity works. "Mini-TES has told us a lot about the rocks and soils at Meridiani, but we've learned that the differences among Meridiani rocks are often too subtle for it to distinguish," Squyres said. "The same instrument on Spirit, at Gusev Crater, has a much more crucial role for us at this point in the mission because there is such diversity at Gusev." Researchers will rely heavily on a different type of instrument, Opportunity's alpha particle X-ray spectrometer, for analysis of rocks at the bright-band target layer in the crater.

The Jet Propulsion Laboratory manages the Mars Exploration Rover project for NASA's Science Mission Directorate. For images and information about the rovers, visit: http://www.nasa.gov/rovers .

RELATED MULTIMEDIA:
Audio podcast: http://www.nasa.gov/multimedia/podcasting/jpl-opportunity-20070906.html .
Broadcast-quality audio clips: http://www.nasa.gov/audience/formedia/audio/jpl-mer-20070907.html
Video file with animation, images and sound bites airing today on NASA TV.


Media contacts:
Guy Webster
Jet Propulsion Laboratory, Pasadena, Calif.
818-354-6278/5011

Dwayne Brown 202-358-1726
NASA Headquarters, Washington

2007-098

Monday, September 17, 2007

Hubble Captures Stars Going Out In Style

The colorful, intricate shapes in these NASA Hubble Space Telescope images reveal how the glowing gas ejected by dying Sun-like stars evolves dramatically over time.


The colorful, intricate shapes in these NASA Hubble Space Telescope planetary nebulae images reveal how the glowing gas ejected by dying Sun-like stars evolves dramatically over time. These four planetary nebulae all lie in our Milky Way Galaxy. Their distances from Earth are all roughly the same, about 7,000 light-years. The snapshots of He 2-47, NGC 5315, IC 4593, and NGC 5307 were taken with Hubble's Wide Field Planetary Camera 2 in February 2007. (Credit: NASA, ESA, and The Hubble Heritage Team (STScI/AURA))

These gaseous clouds, called planetary nebulae, are created when stars in the last stages of life cast off their outer layers of material into space. Ultraviolet light from the remnant star makes the material glow. Planetary nebulae last for only 10,000 years, a fleeting episode in the 10-billion-year lifespan of Sun-like stars.

The name "planetary nebula" has nothing to do with planets. These objects got their name because their round shapes resembled planets when seen through the small telescopes of the eighteenth century.

The Hubble images show the evolution of planetary nebulae, revealing how they expand in size and change temperature over time. A young planetary nebula, such as He 2-47, at top, left, for example, is small and is dominated by relatively cool, glowing nitrogen gas. In the Hubble images, the red, green, and blue colors represent light emitted by nitrogen, hydrogen, and oxygen, respectively.

Over thousands of years, the clouds of gas expand away and the nebulae become larger. Energetic ultraviolet light from the star penetrates more deeply into the gas, causing the hydrogen and oxygen to glow more prominently, as seen near the center of NGC 5315. In the older nebulae, such as IC 4593, at bottom, left, and NGC 5307, at bottom, right, hydrogen and oxygen appear more extended in these regions, and red knots of nitrogen are still visible.

These four nebulae all lie in our Milky Way Galaxy. Their distances from Earth are all roughly the same, about 7,000 light-years. The snapshots were taken with Hubble’s Wide Field Planetary Camera 2 in February 2007. Like snowflakes, planetary nebulae show a wide variety of shapes, indicative of the complex processes that occur at the end of stellar life.

He 2-47, at top, left, is dubbed the “starfish” because of its shape. The six lobes of gas and dust, which resemble the legs of a starfish, suggest that He 2-47 puffed off material at least three times in three different directions. Each time, the star fired off a narrow pair of opposite jets of gas. He 2-47 is in the southern constellation Carina.

NGC 5315, the chaotic-looking nebula at top, right, reveals an x-shaped structure. This shape suggests that the star ejected material in two different outbursts in two distinct directions. Each outburst unleashed a pair of diametrically opposed outflows. NGC 5315 lies in the southern constellation Circinus.

IC 4593, at bottom, left, is in the northern constellation Hercules.

NGC 5307, at bottom, right, displays a spiral pattern, which may have been caused by the dying star wobbling as it expelled jets of gas in different directions. NGC 5307 resides in the southern constellation Centaurus.

Note: This story has been adapted from a news release issued by Space Telescope Science Institute.

Recent Developments
Here are some recent developments in the field of nanotech. Amazing as recent advances in nano-manufacturing are, they still cannot deliver the miracles of the nanotechnological future. Although experimental lithographic techniques for etching semiconductor printed circuits promise a resolution of 100 nanometres or even finer, this is still far too coarse to build true nano-scale machines of atomic-scale precision. And tediously shifting around one atom at a time with incredibly expensive equipment will hardly enable the mass production of billions of devices. Nevertheless, the microscopic gears and motors that have already been built do at least indicate the viability of molecular machines.



- Nanotech Links - Recent developments -

Test-drive for nanocopters
Fri 24 Nov 2000 - The first microscopic "helicopters", which could one day carry out medical tasks inside the body, have been built and test-driven by scientists at Cornell University. The devices are no bigger than a virus particle. They consist of metal propellers and a biological component attached to a metal post. The biological component converts the body's biochemical fuel, ATP, into energy, which is used to turn the propellers at a rate of eight rotations per second. In tests, the nano-helicopters' propellers turned for up to 2 1/2 hours. This is an important first step towards producing miniature machines capable of functioning inside living cells, but at this stage the technology is still very inefficient: only five of the first 400 biomotors worked properly, and it still has to be proved that the machines can function inside a living cell.

Nature's Way Might Be Path to Smaller Computer Chips
8 June 2000 - The same trick an oyster uses to make mother-of-pearl may ultimately enable researchers to "grow" ultra-miniaturized computer chips. The electrical pathways would be self-assembled like the delicate whorls of seashells, rather than etched by conventional manufacturing techniques, and would be only a fraction the size of the smallest circuit components possible today. The work represents a major technological breakthrough in the field of nanotechnology that will lead to a whole new class of nanoscale sensors.
"Nature's Way" really seems the logical path to nanotech - through simulation of nature, rather than miniaturisation of industrialism. After all, the natural biosphere has been building "nanomachines" (life) for over three and a half billion years.

Scientists Discover How to Make Nanostructures Assemble Themselves - Technique Could Yield New Generation of Miniature Electronics - Princeton University - 18 November, 1999 - "Princeton researchers have created ultrasmall plastic structures with a method that is cheaper and more versatile than previous techniques. The discovery has yielded surprising insights into the behavior of materials at very small scales, while spawning many basic research questions. It also could pave the way to a new generation of miniature products, from computer memory chips and video components to devices for sorting DNA molecules..."

Yale Research on Molecular Switches May Lead to Smaller, Cheaper Computers - 18 November, 1999 - Yale and Rice University scientists have demonstrated molecular devices that act as reversible electronic switches, making it possible to build smaller computers that are less expensive.

Northwestern chemists plot the next step in nanotechnology - October 1999 - researchers at Northwestern University demonstrate a new technology that may be used to miniaturize electronic circuits, put thousands of different medical sensors on an area much tinier than the head of a pin and develop an understanding of the intrinsic behavior of ultrasmall structures -- ones comprised of a small collection of molecules patterned on a solid substrate. By miniaturizing existing writing and printing techniques,...a research team led by Chad Mirkin, Charles E. and Emma H. Morrison Professor of Chemistry and director of Northwestern's Center for Nanotechnology, has paved the way for such possibilities. In their paper (Science, Oct 15 1999), the researchers detail how they have transformed their world's smallest pen (Science, Jan. 29, 1999) into the world's smallest plotter, a device capable of drawing multiple lines of molecules -- each line only 15 nanometers or 30 molecules wide -- with such precision that only five nanometers, or about 200 billionths of an inch, separate each line. By contrast, a human hair is about 10,000 nanometers wide....It is the nano-plotter's accuracy of registration when building nanostructures of different organic molecules that could dramatically impact molecule-based electronics, molecular diagnostics and catalysis, in addition to leading to new applications not yet imagined in nanotechnology...."

Researchers at Sandia National Laboratories in Albuquerque, New Mexico in 1996 announced the creation of "intelligent" micromachines that incorporate integrated-circuit controllers, and of sinking electric motors into tiny etched trenches, enabling the fabrication of entire electromechanical systems on a chip.

Researchers at Cornell University in August 1997 unveiled the world's smallest guitar. Smaller than a single cell, the "nanoguitar" is only 10 micrometers long and really plays. The strings are only 50 nanometers wide and can be plucked by an atomic force microscope. But the nanoguitar is too small to generate sound at frequencies audible to the human ear.

In June of 1997 a team of Australian researchers managed to build a functioning nanomachine: a biosensor, a combination of biology and physics designed to detect substances with extreme sensitivity. It consists of a synthetic membrane chemically tethered to a thin metal film coated onto a piece of plastic. This membrane behaves like the outer skin of the cells of the human body in its ability to sense other molecules. The central component of the device is a tiny electrical switch, an ion-channel, only 1.5 nanometres in size. Being cheap and easy to use, the biosensors have a huge range of potential uses, e.g. detecting drugs, hormones, viruses, pesticides, gene sequences, medically-active compounds, and more.

Nanothinc is a company providing information services concerning nanotechnology and related technologies. A huge amount of material is archived on its site; by checking regularly you can keep up with recent developments.

Recent development

Recent Developments
Here are some recent developments in the field of Nanotech. Amazing as the recent developments in nano-manufacturing are, they still cannot deliver to us the miracles of the nanotechnological future. Although experimental lithographic techniques for etching semiconductor printed circuits promise a resolution of 100 nanometres or even finer, this is still far to coarse to build a true nano-scale machines of atomic scale precision. While tediously shifting around one atom at a time using incredibly expensive equipment will hardly enable the mass-producing of billions of devices. Nevertheless, the microscopic gears and motors that have already been built do at least indicate the viability of making molecular machines.

Test-drive for nanocopters
Fri 24 Nov 2000 - The first microscopic "helicopters", which could one day carry out medical tasks inside the body, have been built and test-driven by scientists at Cornell University. The devices are no bigger than a virus particle. They consist of metal propellers and a biological component attached to a metal post. The biological component converts the body's biochemical fuel, ATP, into energy, which is used to turn the propellers at a rate of eight rotations per second. In tests the nano-helicopters' propellers turned for up to 2 1/2 hours. This is an important first step towards producing miniature machines capable of functioning inside living cells, but at this stage the technology is still very inefficient: only five of the first 400 biomotors worked properly, and it still has to be proved that the machines can function inside a living cell.
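
The figures quoted above can be sanity-checked with a little arithmetic (a small illustrative script, not part of the original report):

```python
# Quick arithmetic on the Cornell nanocopter figures quoted above.

rotations_per_second = 8
run_time_hours = 2.5
total_rotations = rotations_per_second * run_time_hours * 3600
# 8 rot/s sustained for 2.5 hours comes to 72,000 rotations

working_motors = 5
motors_built = 400
yield_fraction = working_motors / motors_built   # 1.25% success rate

print(f"{total_rotations:,.0f} rotations, {yield_fraction:.2%} yield")
```

A 1.25% yield puts "very inefficient" in concrete terms: 395 of the first 400 biomotors never worked.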

Nature's Way Might Be Path to Smaller Computer Chips
8 June 2000 - The same trick an oyster uses to make mother-of-pearl may ultimately enable researchers to "grow" ultra-miniaturized computer chips. The electrical pathways would be self-assembled like the delicate whorls of seashells, rather than etched by conventional manufacturing techniques, and would be only a fraction of the size of the smallest circuit components possible today. The work represents a breakthrough in nanotechnology that could lead to a whole new class of nanoscale sensors.
"Nature's Way" really seems the logical path to nanotech - through simulation of nature, rather than miniaturisation of industrialism. After all, the natural biosphere has been building "nanomachines" (life) for over three and a half billion years.

Scientists Discover How to Make Nanostructures Assemble Themselves - Technique Could Yield New Generation of Miniature Electronics - Princeton University - 18 November, 1999 - "Princeton researchers have created ultrasmall plastic structures with a method that is cheaper and more versatile than previous techniques. The discovery has yielded surprising insights into the behavior of materials at very small scales, while spawning many basic research questions. It also could pave the way to a new generation of miniature products, from computer memory chips and video components to devices for sorting DNA molecules..."

Yale Research on Molecular Switches May Lead to Smaller, Cheaper Computers - 18 November, 1999 - Yale and Rice University scientists have demonstrated molecular devices that act as reversible electronic switches, making it possible to build smaller computers that are less expensive.

Northwestern chemists plot the next step in nanotechnology - October 1999 - Researchers at Northwestern University demonstrate a new technology that may be used to miniaturize electronic circuits, put thousands of different medical sensors on an area much tinier than the head of a pin, and develop an understanding of the intrinsic behavior of ultrasmall structures -- ones composed of a small collection of molecules patterned on a solid substrate. "By miniaturizing existing writing and printing techniques...a research team led by Chad Mirkin, Charles E. and Emma H. Morrison Professor of Chemistry and director of Northwestern's Center for Nanotechnology, has paved the way for such possibilities. In their paper (Science, Oct. 15, 1999), the researchers detail how they have transformed their world's smallest pen (Science, Jan. 29, 1999) into the world's smallest plotter, a device capable of drawing multiple lines of molecules -- each line only 15 nanometers or 30 molecules wide -- with such precision that only five nanometers, or about 200 billionths of an inch, separate each line. By contrast, a human hair is about 10,000 nanometers wide....It is the nano-plotter's accuracy of registration when building nanostructures of different organic molecules that could dramatically impact molecule-based electronics, molecular diagnostics and catalysis, in addition to leading to new applications not yet imagined in nanotechnology."
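
The unit conversions in that passage are easy to verify (a throwaway check, assuming 1 inch = 2.54 cm):

```python
# Verify the nano-plotter figures quoted above.

NM_PER_INCH = 2.54e7          # 1 inch = 2.54 cm = 25,400,000 nm

line_spacing_nm = 5
spacing_in_inches = line_spacing_nm / NM_PER_INCH
# about 1.97e-7 inch, i.e. roughly 200 billionths of an inch, as quoted
print(f"{spacing_in_inches * 1e9:.0f} billionths of an inch")

line_width_nm = 15
molecules_per_line = 30
print(f"{line_width_nm / molecules_per_line} nm per molecule")

hair_width_nm = 10_000
print(f"a human hair spans {hair_width_nm // line_width_nm} line widths")
```

The last figure gives a feel for the scale: over six hundred of these molecular lines would fit across a single hair.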

Researchers at Sandia National Laboratories in Albuquerque, New Mexico in 1996 announced the creation of "intelligent" micromachines that incorporate integrated-circuit controllers, and of sinking electric motors into tiny etched trenches, enabling the fabrication of entire electromechanical systems on a chip.

Saturday, September 15, 2007

Google Shoots for the Moon

After pioneering software that allows people to explore Earth in unprecedented detail, Google set its sights on mapping the moon. Now the popular Internet search engine wants to help send a robot there. Yesterday, the Mountain View, California-based company announced that it will sponsor a $30 million prize for the first privately funded lunar robotic rover.
"We are here today embarking upon this great adventure of having a nongovernmental, commercial organization return to the moon and explore," Google co-founder Sergey Brin said in a statement. "And I'm very excited that Google can play a part in it."

The contest is modeled on the Ansari X Prize, which offered $10 million to the first private company to reach suborbital space twice within 2 weeks using a reusable piloted vehicle. Scaled Composites in Mojave, California, won the contest in 2004 and is now building a commercial version for British entrepreneur Richard Branson's Virgin Galactic.

According to the rules of the Google Lunar X Prize, the winner must be the first to land a spacecraft on the moon that can travel at least 500 meters and send back data, images, and video of the surface. Peter Diamandis, X Prize chair and chief executive officer of the X Prize Foundation, the organization running the Google prize, explained in a statement that he wants to encourage low-cost missions to the moon in order to open the frontier beyond Earth orbit. "It could be another 6 to 8 years before any government returns," he said. "Even then, it will be at a large expense, and probably with little public involvement."

NASA recently canceled a series of rovers--each of which it estimated would cost hundreds of millions of dollars--because of budget constraints. Diamandis says he hopes to lure innovators who normally would not be attracted to typical government space contracts.

A host of government efforts to reach the moon are already under way. Today, Japan's space agency launched a mission from Tanegashima Island. A Chinese spacecraft is slated for liftoff by the end of this year, and a NASA flight will follow next year. All three missions involve orbiters, however, meaning no craft will actually land on the lunar surface. China is also considering sending a rover to the moon early in the next decade in preparation for a possible human landing.

Related site

Google Lunar X Prize