The spiraling costs of mining, extracting, and harnessing non-renewable energy sources are compelling individuals and institutions to explore renewable sources of energy as a viable alternative. Though many believe the US began looking for alternative ways to generate electricity for homes, cities, and transportation only after the 1970s 'energy crisis', attempts to build devices for tapping sunlight, wind, and geothermal heat began more than a century ago. In 1876, Richard Day, along with his teacher William Grylls Adams, first generated electricity from the sun's rays using selenium cells.
Fast-forward to 1953, when three American researchers designed the first prototype of a solar, or photovoltaic, cell using silicon. This cell generated enough electricity to run a small electrical device. By 1965, solar cells were available for sale, but they were prohibitively expensive; they were gradually adopted for modest uses such as playing a radio or powering a toy. In the following decades, the two reigning superpowers, the USA and the USSR, harnessed solar cells extensively to run space programs and satellites. From the 1970s onward, widespread use of photovoltaic cells led to a massive reduction in their unit cost, from about $100 to roughly $20 per watt.
Off-shore oil rigs made heavy use of solar-powered lighting to illuminate the tops of their towers. Subsequently, these cells came to be used for heating tap water in homes and offices, running automobiles, and lighting homes. Solar power is now used to fly planes, supply electricity to desert areas, power microwave towers, and run watches and calculators. With every passing day, photovoltaic cells are finding new applications.