19 November, 2011

PROTECT IP Act and SOPA

The PROTECT IP Act (Preventing Real Online Threats to Economic Creativity and Theft of Intellectual Property Act of 2011) was introduced on May 12, 2011 by Senator Patrick Leahy to give the government and copyright holders additional tools to curb access to "rogue websites dedicated to infringing or counterfeit goods".
Here you can find all the information about this bill, and why it should be stopped.





This bill will not be able to stop piracy; it will only give the government greater censorship powers.

For instance, a blocked website could still be reached simply by using its IP address. The site would no longer appear in search engines, but this would not stop its activity, since it could be found by other means. Quoting Eric Schmidt, it is just an attempt to apply a simple solution to a complex problem.

The bill would, however, allow the government to block access to a given domain and to all hyperlinks pointing to it. Social networks and search engines would therefore have to censor their users, because they could be shut down if any user posted copyright-infringing material.

Moreover, this bill could be taken as an example by other countries, which could start to apply censorship in a similar way. If every country applied such laws, the web would be different in every nation, and the most important form of communication of the 21st century would effectively die.

18 November, 2011

The atomic structure [5]: Heisenberg uncertainty principle

The Bohr and Sommerfeld atomic models could not be considered exhaustive, because they were based on classical mechanics and introduced postulates without explaining them. Moreover, they could not properly interpret all the experimental results: for atoms other than hydrogen, the measured energy levels were completely different from the ones those models predicted.


The question now was whether it made sense to build models that required the existence of precise orbits. To calculate the trajectory of a point particle, it is necessary to know its position and velocity at a given moment.


In response to this problem Heisenberg proposed the uncertainty principle, according to which it is not possible to measure accurately and simultaneously both the position and the velocity (i.e. the momentum) of a particle.
This means that the less energetic the radiation used to observe a particle, the less its path is disturbed and the more meaningful the measurement. However, to observe the path of an electron, radiation with a wavelength comparable to the electron's dimensions is required. Such a small wavelength corresponds to a high energy, and therefore to a large change in the particle's speed, which in several cases is enough to ionize the atom.
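
For reference, in its standard quantitative form (which the post does not write out explicitly), the principle reads

$$\Delta x\,\Delta p \;\ge\; \frac{\hbar}{2},$$

where Δx is the uncertainty on the position, Δp the uncertainty on the momentum and ħ the reduced Planck constant.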

17 November, 2011

The atomic structure [4]: Wave-particle duality

In 1922 Compton began to study electromagnetic waves, with special interest in the interaction between X rays and a graphite target. He observed that the scattered radiation had a longer wavelength than the incident one. This phenomenon couldn't be explained by classical theories, which predicted that the emitted radiation should have the same frequency as the incident one, since the incident wave was supposed to set the charges in the graphite oscillating at its own frequency.
Compton proposed that the phenomenon could be explained as an interaction between particles: photons and electrons. A photon interacting with the graphite strikes an electron and transfers part of its energy to it. The scattered photon therefore carries less energy than the incident one, which means a lower frequency and hence a longer wavelength, explaining Compton scattering.
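
Treating the collision quantitatively leads to the textbook Compton formula, quoted here in its standard form since the post does not reproduce it:

$$\Delta\lambda = \lambda' - \lambda = \frac{h}{m_e c}\,(1-\cos\theta),$$

where λ and λ' are the incident and scattered wavelengths, m_e is the electron mass and θ is the scattering angle.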


This new discovery, together with the photoelectric effect, showed that light can behave as a particle. De Broglie was firmly convinced of the unity of nature: if waves could behave like matter, then matter also had to behave like waves. In 1924 he proposed that the motion of every particle is associated with the propagation of a wave, according to his hypothesis:
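
The relation referred to is not reproduced in the text; in its standard form the de Broglie hypothesis reads

$$\lambda = \frac{h}{p} = \frac{h}{mv},$$

where λ is the wavelength associated with the particle, h is Planck's constant and p = mv is the particle's momentum.
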
This was experimentally confirmed in 1927 by Davisson and Germer. If an electromagnetic wave passes through an opening with a radius comparable to its wavelength, it produces diffraction. The same behaviour is observed when electrons pass through an opening comparable to their de Broglie wavelength, confirming the wave-particle duality of light and matter.

16 November, 2011

Is it worth saving pandas?

Conservation has limited resources, so pragmatic choices need to be made. No one wants to hear of any species going extinct, especially species that are well known and loved by the public, but letting some go may turn out to be necessary.


These species are indeed very expensive to keep going, and most of the resources in this field are directed toward them and a few others, while the best thing to do might be to preserve biodiversity hotspots. If habitats are not preserved, there is no point in talking about preserving biodiversity. So if all the money were not spent on those famous species but used, for example, to buy rainforest, biodiversity might benefit far more.




On the other hand, megafauna like pandas and tigers appeal to people's emotional side and attract a lot of attention, raising the possibility of larger funding. This has led to a media phenomenon called single-species conservation. This kind of campaign began in the seventies with Save the Tiger, Save the Panda, Save the Whale, and so on, but perhaps that era has come to an end.


Nevertheless, many species that could be "worth saving" live in a narrowly defined habitat. This means they don't need a large habitat to survive, so the protected area would be restricted. In conservation terms it is therefore better to try to protect species that live at higher levels in the food web, so that protection is extended to all the other species related to the protected one.


Furthermore, protecting those species requires the conservation of larger habitats than the ones required by "lower" species. Megafauna could hence be used as a media vehicle for habitat conservation: there are things you pull out of the picture because people can relate to them, and it does make a difference.

15 November, 2011

The atomic structure [3]: the Bohr model

The spectral emission lines of atoms turned out to consist of electromagnetic waves whose frequencies vary in a discrete way. The simplest spectrum to calculate is that of hydrogen, because its atom contains only one electron. The formula for the frequency of these lines was proposed by Rydberg in 1888.
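
For reference, the standard form of the Rydberg formula for hydrogen (not written out in the original post) is

$$\frac{1}{\lambda} = R_H\left(\frac{1}{n_1^2} - \frac{1}{n_2^2}\right), \qquad n_2 > n_1,$$

where R_H ≈ 1.097x10^7 m^-1 is the Rydberg constant and n_1, n_2 are integers labelling the two energy levels involved in the transition.
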
The problem of interpreting these experimental results was solved by Niels Bohr in 1913, who modified Rutherford's model by introducing quantum ideas. The main points are:
  1. Electrons move around the nucleus in circular orbits (or elliptical, according to Sommerfeld) under the action of the Coulomb force;
  2. Electrons cannot occupy arbitrary orbits, but only those with certain allowed values of angular momentum;
  3. An electron moving in one of the allowed orbits does not radiate energy;
  4. The atom emits radiation only when an electron passes from a higher-energy stationary state to a lower one.

Bohr succeeded in calculating the atomic radius of hydrogen by combining the quantization of angular momentum with the condition that the Coulomb force provides the centripetal force.
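
A standard sketch of the calculation (the original post did not show it explicitly) is the following. Imposing the quantization of angular momentum (point 2 above) and requiring the Coulomb attraction to act as the centripetal force,

$$mvr = n\hbar, \qquad \frac{1}{4\pi\varepsilon_0}\frac{e^2}{r^2} = \frac{mv^2}{r},$$

one obtains

$$r_n = \frac{4\pi\varepsilon_0\hbar^2}{me^2}\,n^2, \qquad E_n = -\frac{13.6\ \mathrm{eV}}{n^2}.$$

For n = 1 the radius is the Bohr radius, about 0.53x10^-10 m.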

There are therefore infinitely many allowed stationary states, because n is a natural number. This infinity is discrete, not continuous as in classical mechanics. This result is a direct consequence of the quantization of angular momentum, which also implies the quantization of the electron's energy.
This atomic model, however, can be applied only to the hydrogen atom. For more complex atoms it is necessary to introduce another quantum number: their spectral lines show multiplets, which imply the presence of further energy levels lying closer to each other.

This was explained by Sommerfeld by introducing elliptical orbits. These theories were still based on a classical idea of the atom, which is far from correct, even though the secondary quantum number is also used in wave mechanics, where it describes the shape of the orbital.

The atomic structure [2]: the birth of quantum mechanics

By the end of the 19th century, it was clear that Maxwell's theory of electromagnetism could not explain several phenomena, such as:
  1. Black body radiation;
  2. Photoelectric effect;
  3. Compton scattering;
  4. Emission and absorption spectra of atoms.
These phenomena were interpreted using quantum mechanics, which was based on the idea that light could behave as a particle and not only as a wave. Max Planck was the first to introduce the idea of quantization, which made it possible to explain black body radiation.



If we consider any metal heated to high temperature, we see that it emits radiation in a continuous spectrum that varies with temperature. A black body is a theoretical object that behaves in a similar way: it absorbs all incoming radiation, and its emission depends only on its temperature. Its spectrum shows a peak of emission at a certain wavelength, which varies according to Wien's law.
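
Wien's law, quoted here in its standard form (the post does not write it out), states that

$$\lambda_{max}\,T = b \approx 2.9\times10^{-3}\ \mathrm{m\,K},$$

so the hotter the body, the shorter the wavelength at which its emission peaks.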

In 1900 Max Planck succeeded in interpreting the experimental results on black body radiation. He assumed that energy could not vary in a continuous way, but was quantized.
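
In modern notation (a standard form, not quoted from the original post), his hypothesis is that energy is exchanged only in multiples of an elementary quantum,

$$E = h\nu,$$

where ν is the frequency of the radiation and h ≈ 6.63x10^-34 J·s is Planck's constant.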

In 1905 Einstein used this new idea to solve the paradox of the photoelectric effect. In this effect, electrons are emitted from matter when it absorbs energy from electromagnetic radiation of short wavelength, such as visible or ultraviolet light. Experiments showed that electron emission took place only if the incident radiation had a frequency greater than a certain threshold (characteristic of the irradiated material). The emitted electrons had a kinetic energy ranging from zero to a maximum value that depended on the frequency of the radiation. The number of emitted electrons was proportional to the intensity of the incident radiation, while their speed (and hence their kinetic energy) was independent of it.



According to classical physics, electrons in the surface layers could be excited by the incident radiation, but the speed of the emitted electrons should then increase with the intensity of the radiation. This contradicted the experimental results, so a new conception of electromagnetic waves was needed.

Einstein proposed that electromagnetic waves, when interacting with matter, behave as corpuscles, as if they were composed of quanta of light: photons. When the intensity of the radiation increases, the energy of each photon stays the same; what increases is the number of photons per unit surface. A photon can transfer its energy to a surface electron: if this energy is greater than the minimum required, the electron is emitted with a kinetic energy equal to the difference between the photon's energy and that minimum.
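
This reasoning is summarized by Einstein's photoelectric equation, given here in its standard form since the original post did not show it:

$$E_{k,\,max} = h\nu - W_0,$$

where hν is the energy of the photon and W_0 is the work function, i.e. the minimum energy needed to extract an electron from the material.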

This explains why a greater intensity of incident radiation corresponds to a greater number of emitted electrons, while their kinetic energy does not change. This interpretation led to the introduction of the wave-particle duality of light.

13 November, 2011

The atomic structure [1]: the discovery of fundamental particles

Until the end of the 19th century, the most widely accepted idea of the atom was Dalton's. His theory was based on five fundamental points:
  1. Everything is composed of invisible and indivisible particles, called atoms. This idea had already appeared in Greece more than two thousand years earlier, thanks to Democritus;
  2. Atoms can be neither created nor destroyed;
  3. All the atoms of a given element are identical, sharing the same mass and chemical properties;
  4. Different elements are made of different atoms, with different masses and chemical properties;
  5. Different atoms can combine with each other to form more complex particles.
This theory proved to be incomplete between the end of the 19th and the beginning of the 20th century, when it was discovered that atoms are not fundamental particles, but are composed of three other particles: electrons, protons and neutrons.



Electrons were the first to be discovered, thanks to Goldstein's research on cathode rays. These rays were emitted by the negative electrode (cathode) in a tube filled with rarefied gas under a strong electric field, and caused certain materials to fluoresce. The fact that they were deflected by magnetic fields showed that they were negatively charged. They also formed with every gas used, regardless of the material the cathode was made of.

In 1897 Thomson succeeded in measuring the charge-to-mass ratio of these particles, and Millikan later determined the actual value of their charge. These discoveries led Thomson to hypothesize a model in which the atom was a diffuse sphere of positive charge with the electrons distributed inside it, so as to account for its overall neutrality.



This atomic model was shown to be wrong in 1911 by the Geiger-Marsden experiment. A source of alpha particles was placed in front of a thin sheet of gold, with a fluorescent screen around it to record their direction. The result was that about 99% of the particles passed through the sheet without any deviation, a small number were deflected, and only a few bounced back.



Rutherford hypothesized that atoms are mostly empty space, with most of the mass concentrated in a positively charged nucleus. Electrons had to orbit around the nucleus, much like planets around the Sun, kept on their paths by the Coulomb attraction. Rutherford also calculated that the protons making up the nucleus accounted for only about half of the atom's mass, and that the electrons contributed very little. Another massive, neutral particle was therefore required: the neutron. It was discovered only in 1932 by Chadwick, through the neutral high-energy radiation emitted by certain nuclei.

Rutherford's atomic model was, however, in contradiction with the classical laws of electromagnetism. An accelerating charged particle loses energy by emitting electromagnetic waves, so an electron orbiting a nucleus would lose its energy in a split second, making any stable atom impossible.

11 October, 2011

The photosynthetic slug

Along the East coast of North America lives a small mollusc, the sea-slug Elysia chlorotica, which is able to incorporate chloroplasts from the alga Vaucheria litorea, on which it feeds. This gastropod can use these organelles to survive for long periods without food, living on the sugars produced by photosynthesis. During its juvenile period the slug needs to feed on the alga before the chloroplasts are stably incorporated into its cells.


However, the most interesting fact about Elysia chlorotica is its ability to keep the chloroplasts alive for several months. This is because it doesn't merely incorporate the organelles: it also integrates into its own genome the alga's genes that support the chloroplasts in their photosynthetic function. This process is usually called "horizontal gene transfer", since an organism acquires genes from another organism without being its offspring, and it is common among bacterial species.

10 October, 2011

Warm-blooded reptiles?

Ever since the first dinosaur fossils were found, many scientists agreed that these creatures had a metabolism similar to that of modern reptiles, meaning they were ectothermic (their internal temperature follows that of the external environment). However, later discoveries led to the opposite hypothesis: endothermy.

The major doubts about dinosaurs' mechanisms of thermoregulation arise when we consider the species that lived at polar latitudes. There are several sites in Australia and Russia where a great variety of dinosaurs has been discovered, in some cases living at temperatures close to zero degrees. For example in northern Russia, near Kakanaut, many fossils of both carnivores (troodontids, dromaeosaurids, tyrannosaurids) and herbivores (hadrosaurids, ankylosaurs and others) have been extracted from rocks dating back to the Cretaceous. Some hadrosaur eggs have even been found, which suggests that these animals lived there year-round rather than migrating.

Recent research by Holly Woodward, Jack Horner and colleagues at Montana State University (published online in PLoS ONE) showed that dinosaurs living at polar latitudes weren't physiologically different from other species, contrary to what an earlier study had claimed, a study influenced by the scarcity of finds available at the time.

The endothermy hypothesis is also supported by the presence of feathers on many dinosaur species, especially theropods like dromaeosaurids. This group includes the famous Velociraptor and is probably an evolutionary line parallel to that of birds, with which they may share a common ancestor. Structures like feathers or hair are typical of endothermic animals and are important in thermoregulation, as they insulate the body and make it less affected by changes in environmental temperature.

Another study, published in Science and led by researchers at Bonn University and the California Institute of Technology, determined the internal temperature of sauropods. These enormous herbivores had a body temperature similar to that of modern mammals, between 36 °C and 38 °C. The researchers analyzed the teeth of these creatures, which contain carbonates built from different isotopes of carbon and oxygen. Since the temperature at which the carbonate forms influences how often 13C and 18O bind together within the tooth (the higher the temperature, the rarer this bond), measuring the abundance of these isotope pairs allowed them to determine the animals' body temperature.

All these discoveries, and many others, seem to show that the most diversified and successful group of reptiles had a physiology similar to that of mammals, which probably allowed them to colonize almost all the available environments.

09 October, 2011

Thermonuclear fusion: the confinement of plasma [3]

Stars shine because of the high temperature and pressure in their cores. This is called gravitational confinement, and for technical reasons it cannot be reproduced on Earth. To reach the required conditions, other forms of confinement have been proposed, which use higher temperatures and lower pressures. Fusion reactors are nowadays divided into two categories: inertial and magnetic confinement.

Inertial Confinement

In these reactors, the plasma is obtained using high-energy lasers. Small spheres containing a mixture of deuterium and tritium are placed in a vacuum chamber.


Several laser beams hit the spheres, causing the evaporation of the plastic shell called the "ablator". The deuterium and tritium are driven toward the geometrical centre, reaching very high density and temperature.


The reactor shown in the figure is the National Ignition Facility, completed in California in 2009. It uses 192 lasers and at the moment it does not produce more energy than it consumes. However, the project has only just begun and it could soon reach a much better energy balance.

Magnetic Confinement

Even though inertial confinement could soon reach a good energy efficiency, magnetic confinement seems closer to this goal. This type of reactor is based on the principle that a plasma is composed of charged particles, which are affected by the Lorentz force.


The magnitude of this force depends on the speed of the particle and on its charge. It is zero if the charge is zero or if the velocity is parallel to the magnetic field, and it is maximum when they are perpendicular. The direction of the force is perpendicular both to the velocity and to the magnetic field: it cannot change the particle's speed, only its direction. A velocity perpendicular to the field results in a circular trajectory, while any other orientation leads to a helical trajectory, described by the following equation:
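
The equation referred to is not reproduced in the text; the standard relations behind it are the magnetic part of the Lorentz force and the resulting radius of gyration:

$$\vec{F} = q\,\vec{v}\times\vec{B}, \qquad r = \frac{m\,v_\perp}{|q|\,B},$$

where v_⊥ is the component of the velocity perpendicular to the field. The particle spirals around the field line with this radius, which is the basic idea behind magnetic confinement.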

According to these laws, a possible way to confine a plasma is to use a solenoid closed by magnetic mirrors. However, the efficiency of this geometry is not even close to that of a toroid.


Based on this model, several types of reactors have been proposed and tested. The most efficient geometry turned out to be the "Stellarator", which adds further helical twists to the coils of the toroid. This different geometry makes it possible, for instance, to eliminate the axial current that every other toroidal device needs in order to create the poloidal field (as we can see in the previous figure).


The main problem with the Stellarator is its extreme complexity: only a few experimental devices of this type have been built. Simpler reactors are based on the "Tokamak" geometry, which is the most widely used nowadays.

08 October, 2011

Thermonuclear fusion: deep inside the heart of stars [2]

The process of thermonuclear fusion requires high temperatures and high pressures. Until now those conditions have been found only inside the hearts of stars. There the elements are in the form of an ionized gas: the particles carry an electric charge, so before the nuclear interaction can act, the Coulomb repulsion has to be overcome.



Considering the simplest situation, with two hydrogen nuclei, the energy required to overcome the Coulomb barrier is about 1000 keV. Inside a star the temperature is usually around 10^7 K. The thermal energy of a monatomic gas can be calculated as follows:
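
The formula referred to is not reproduced in the text; the standard expression for the average thermal energy of a particle in a monatomic gas is

$$\langle E\rangle = \frac{3}{2}k_B T.$$

With k_B ≈ 8.6x10^-5 eV/K and T = 10^7 K this gives roughly 1.3 keV, far below the ~1000 keV barrier mentioned above.
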
According to these figures, nuclear reactions should be impossible under those conditions. However, three factors combine to give these reactions a certain probability of success:
  • Particles follow a Maxwell speed distribution. This means that a certain fraction of the particles have an energy greater than the average, and some of them can reach the required level;
  • According to quantum mechanics there is a small probability that a particle with low energy can pass through the Coulomb barrier, by quantum tunnelling;
  • Stars are made of an enormous number of particles. Even though the average energy isn't enough to overcome the barrier, a great number of particles can still have enough energy.

Most of the energy produced by a star like the Sun comes from the proton-proton chain, which occurs at core temperatures of the order of 10^7 K. The reaction is basically the transformation of 4 protons into a helium nucleus, according to the following process:
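
The diagram of the individual steps is not reproduced in the text; the net result of the chain, in its standard form, is

$$4\,{}^1\mathrm{H} \;\rightarrow\; {}^4\mathrm{He} + 2e^+ + 2\nu_e + \text{energy} \;(\approx 26\ \mathrm{MeV}),$$

where the energy is released partly as gamma rays and partly as kinetic energy of the products.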


The total energy produced by this process is about 26 MeV per cycle, although roughly 0.26 MeV is carried away by particles known as neutrinos.


This chain is prevalent in stars only as long as their cores remain relatively cool. When the core temperature rises above roughly 1.5-2x10^7 K, another chain takes over, known as the CNO cycle (Carbon-Nitrogen-Oxygen).
Like the previous one, this chain consists in the transformation of 4 protons into a helium nucleus, and the total energy produced is similar, about 25 MeV. So what makes the difference between one chain and the other? In the CNO cycle heavier nuclei (carbon, nitrogen and oxygen) are used as catalysts; because of their larger charge, higher energies are required to overcome the Coulomb barrier, and therefore higher temperatures.

In any case, the process that is fundamental for the production of energy by thermonuclear fusion on Earth is the p-p chain, since the CNO cycle requires temperatures that are too high for our current technology. Consequently, the following articles will always refer to this cycle.

The hidden twin of the Amazon River


In 2007 a group of Brazilian geologists from the Coordenação de Geofísica do Observatório Nacional, led by the Indian scientist Valiya Hamza, discovered the world's longest underground river, at a depth of about 4 km beneath the Amazon River.


It is an enormous amount of water, flowing very slowly for 6000 km from the Andes to the Atlantic Ocean. With a width between 100 and 200 kilometres, the "Hamza" river is probably the world's largest groundwater flow. Hamza and his team analyzed data from several oil wells drilled by the Petrobras company between 1970 and 1980.


The origin of this river is probably linked to the collision between the South American plate and the one underlying the Pacific Ocean. The peculiarly porous and permeable rocks on the eastern rim allow water to flow down to great depths; impermeable layers then prevent it from rising again, and the topographic gradient (the slope along which the current flows) drives it in the same direction as the Amazon River.

Thermonuclear Fusion: beyond particle physics [1]

The process on which thermonuclear fusion is based is very simple: two light nuclei fuse into a heavier one. But how can this process take place?


If we consider two hydrogen atoms, each nucleus consists of a single proton. This means that they experience an increasing repulsive electromagnetic force as they get closer. According to Coulomb's law this force is:
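
The formula is not reproduced in the text; in its standard form it reads

$$F = \frac{1}{4\pi\varepsilon_0}\,\frac{q_1 q_2}{r^2},$$

where q_1 and q_2 are the two charges and r is their distance: for two protons the repulsion grows rapidly as r decreases.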

If this were the only force acting between nucleons, nuclei could not even exist. Gravity, on the other hand, is attractive, but it is far too weak: for two protons, the electromagnetic repulsion is roughly 10^36 times stronger than their gravitational attraction.
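
This comparison can be made explicit with a standard back-of-the-envelope estimate (not taken from the original post): for two protons the ratio between the electrostatic repulsion and the gravitational attraction is

$$\frac{F_C}{F_G} = \frac{e^2}{4\pi\varepsilon_0\,G\,m_p^2} \approx 1.2\times10^{36},$$

independent of the distance, since both forces fall off as 1/r^2.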


The Standard Model of particle physics postulates the existence of two more forces: the weak and the strong nuclear force. These forces do not affect every particle, and their range is very short: about 10^-15 m for the strong force and 10^-18 m for the weak one. This explains why they are not familiar to us and have no noticeable effect on the macroscopic world.



The Standard Model also proposes that every force has a quantum mediator, represented by a particular particle. For instance, the electromagnetic force is mediated by photons, while gravity is supposed to be mediated by gravitons. We will later focus on the strong nuclear force, which is fundamental for thermonuclear fusion, and on its mediators, the gluons, but first we need to explain what these particles are.


Particles are divided into several categories:
  • Leptons: these particles are fundamental, meaning they are not made up of other particles; they are not affected by the strong nuclear force, only by the weak one (and, if charged, by the electromagnetic one). They are divided into three families: electron, muon and tau, each with a corresponding neutrino.
  • Hadrons: these are massive particles, affected by all four fundamental forces. More than a hundred of them are known today, even though the proton is the only stable one. They are made up of quarks and are divided into two further categories, baryons and mesons.

If we take into consideration the Pauli exclusion principle, according to which:
"Two fermions cannot simultaneously occupy the same quantum state"
we have to divide particles into two further categories:
  • Fermions: particles with half-integer spin, which obey the Pauli principle;
  • Bosons: particles with integer spin, which do not obey the Pauli principle. The quantum mediators of the forces are bosons.
The strong force, for instance, is mediated by bosons called gluons, which obey the laws of quantum chromodynamics. Unlike the electrically neutral photon of quantum electrodynamics (QED), gluons themselves carry colour charge and therefore participate in the strong interaction in addition to mediating it.


Baryons are composed of three quarks, each carrying a colour charge. The various possible combinations determine which particle is formed. The colour charge is not static and changes continuously; this phenomenon gives rise to a residual strong interaction between hadrons, which allows them to coexist inside the nucleus.


The strong nuclear force allows the creation of atomic nuclei. It is deeply related to the binding energy, which is defined as:
"The energy required to break the bond between protons and neutrons inside the nucleus."
This energy per nucleon is different for every nucleus, because it depends on the mass number: it increases with mass number for elements lighter than iron, and then decreases. The fusion of two light nuclei therefore produces a heavier, more tightly bound nucleus, releasing a certain amount of energy.




The mass of the resulting nucleus is in fact lower than the sum of the two lighter ones. According to Einstein's special relativity, this missing mass is converted into energy.
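
In formulas (a standard relation, added here for clarity): if Δm is the difference between the total mass of the initial nuclei and the mass of the product, the energy released is

$$E = \Delta m\,c^2.$$

For example, a helium-4 nucleus is about 0.030 atomic mass units lighter than its four separate nucleons, which corresponds to roughly 28 MeV of binding energy.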

07 October, 2011

Possible solutions to energy crisis

Over the last decades, the problem of energy production has reached very concerning levels. The progressive depletion of coal reserves requires the development of new energy sources. Nowadays, the main alternatives that have been identified are:

  • Nuclear Power;
  • Wind;
  • Solar thermal;
  • Solar photovoltaic;
  • Geothermal;
  • Hydroelectric.
The problem is that none of them can yet compete with fossil fuels, because of their lower efficiency and higher costs. The only plausible alternative at the moment seems to be nuclear power.


Energy produced by nuclear fission can indeed be considered the most efficient form of energy at the moment, but only if we ignore long-term costs. Nuclear waste needs long periods of secure storage to avoid environmental risks. This hazard raises the costs related to this kind of energy production and will eventually make the source uneconomical.

Even though its short-term efficiency is the highest, it necessarily decreases with time because of the storage of nuclear waste. This "nuclear debt" has proved to be a serious problem, weighing on the budgets of several countries.

Another problem is the supply of uranium. Like coal, uranium is a finite, non-renewable resource, which means it won't last forever. Many estimates have been proposed: some say it will last 40 years, others 400, and so on, but this is not the main point. Uranium is in limited supply, and this means it cannot be a definitive solution.

However, another possible source of energy has recently been proposed, the same one that makes the Sun shine: thermonuclear fusion.

06 October, 2011

From raw materials to nanotechnology

Nowadays many people know what a CPU is, but only a few of them can describe how millions of microscopic transistors are built and arranged so that they work together.


From sand to “ingots”…

Sand is a very common material on Earth’s surface and is composed of about 25% silicon.

The first thing to do is to separate the silicon from the sand. This purification process requires several steps to guarantee a high level of purity (at most one foreign atom per million is accepted). The purified silicon is then melted to obtain a single silicon crystal called an “ingot”, which weighs about 100 kg and is 99.9999% pure.






From “ingots” to “wafers” and first treatments…



The ingot is then cut into many thin discs called “wafers”, which are thoroughly cleaned in order to eliminate every defect.
Modern CPUs are generally obtained from 300 mm diameter wafers.







A blue photoresist liquid is then distributed on their surface to prepare them for the next step. During this phase, the wafer is constantly spun around its axis to ensure a uniform distribution of the liquid (as we can see in the picture).









The wafer is then exposed to UV radiation, which is partially filtered by a mask (as we can see in the picture). The parts that have been exposed to the UV rays become soluble. Through the mask it is possible to give a precise shape to the silicon, and through a lens it is possible to project onto the wafer a shape smaller than the one on the mask.
In fact, the pattern on the mask is bigger than the real one on the wafer; this procedure makes it possible to create microscopic transistors from much larger templates.





From “wafers” to transistors…


In the picture we can see an enlarged image of a transistor. A transistor works as a switch, since it is able to control the electric current flowing through it.
Nowadays transistors are so small that it’s possible to place 30 million of them on the head of a pin.
The areas exposed to the UV rays are now dissolved and removed using a specific chemical solvent. This is the first step of the CPU’s construction. The areas that haven’t been exposed to UV rays remain covered by photoresist, which protects the parts to be preserved. The photoresist film is eventually removed and the result is similar to the one in the picture.


Subsequently, a new photoresist film is applied and the wafer is exposed again to UV rays and to a new washing phase. This step is called “ion doping”, because it consists of exposing the silicon to ionized particles, which modify its chemical properties; this is fundamental for creating the electrical properties a CPU needs.
The next step consists of bombarding the exposed areas of the wafer with ions and is called “ion implantation” (the ions are shot at about 300,000 km/h). The implanted ions are embedded in the silicon, altering its chemical properties.
The photoresist layer is removed after the ion bombardment, and the exposed material (the green one) now contains the new atoms.


Now the transistor is almost finished: three holes are etched into the insulation layer (magenta) above the transistor. These holes will be filled with copper, which is essential for linking each transistor to the others.





Copper ions are then deposited on the transistor (a process called “electroplating”). The wafer is plunged into a CuSO4 solution, which is subjected to an electric field, so the copper ions move from the positive terminal (anode) to the negative one (cathode).
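
As a side note (standard electrochemistry, not detailed in the original post): at the anode copper atoms dissolve into the solution, Cu → Cu²⁺ + 2e⁻, while at the cathode, here the wafer itself, the opposite reaction Cu²⁺ + 2e⁻ → Cu deposits metallic copper on the exposed surfaces.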

The outcome is a thin copper layer on the wafer and the excess material is removed.








Several metal layers form the thin “wires” between the transistors, which are extremely important for the CPU’s structure and functions.
If we look at a CPU with the naked eye it seems very thin, but under a microscope we can see that it contains about 20 layers of transistors and interconnecting circuits.





From transistors to “dies”…

Now the CPU is complete, but the product has to be tested: every transistor and circuit is checked in order to verify that the whole system works correctly.



Once its quality has been verified, the wafer is cut into individual units called “dies”. Properly functioning dies are kept, while the others are discarded.

Now the die is placed in its case with a heatsink and...



a new CPU is born!