
COLLEGE OF ENGINEERING KIDANGOOR
(Under the Co-operative Academy of Professional Education (CAPE))
Established by Govt. of Kerala
DEPARTMENT OF ELECTRONICS & COMMUNICATION ENGINEERING

HOLOGRAPHIC MEMORY
AJEESA P.K

DEPARTMENT OF ELECTRONICS AND COMMUNICATION ENGINEERING
OCTOBER 2004


CERTIFICATE

Date:

Certified that the paper entitled HOLOGRAPHIC MEMORY was prepared and presented by AJEESA P.K of the 7th semester B.Tech in Electronics & Communication Engineering, in partial fulfillment of the requirements for the award of the degree of Bachelor of Technology by Cochin University of Science & Technology.

GUIDE

HOD

CO-ORDINATOR


ACKNOWLEDGEMENT

First and foremost we thank God Almighty for his divine grace and blessings in making all this possible. May he continue to lead us in the years to come.

We are grateful to our respected principal, Dr. K. Venkateswara Mallan, for his guidance and support. We extend deep gratitude to our internal guide, Mr. Jentil Jose, Lecturer in Electronics & Communication, whose careful and meticulous reviews improved the quality of this work.

We express our sincere thanks to the HOD of Electronics and Communication for his support and guidance, and to the entire staff of the Electronics & Communication department for their co-operation. Last but not least, we would like to thank all our friends and our parents, who supported us with their love and encouragement through the completion of this seminar.

AJEESA P.K


ABSTRACT
Although conventional storage methods adapt to the growing needs of computer systems, they are reaching their fundamental limits. Improvements to these methods often decrease access times or reduce the size of stored bits, but such systems remain based on serial access, reading one-dimensional streams of bits. Conventional storage also relies on mechanical devices to retrieve data, such as the arm that passes over the magnetic platters in a hard drive. As computer systems continue to become faster, they will need a way to access larger amounts of data in shorter periods of time. For this purpose, holographic memory, a three-dimensional data storage system with fundamental advantages over conventional read/write memory systems, was developed.


INDEX

Chapter 1  Introduction
Chapter 2  Subject detailing
    2.1 Hardware for holographic data storage
    2.2 How data is stored
    2.3 Features of holographic memory
    2.4 Coding and signal processing
    2.5 Sources of noise and distortion
    2.6 Recording materials
    2.7 Summary of polymer work
Chapter 3  Holographic memory vs. existing memory technologies
Chapter 4
    4.1 Possible applications
    4.2 Future developments
Chapter 5  Conclusion
References
Appendix


CHAPTER 1


1. INTRODUCTION
As processors and buses roughly double their data capacity every three years, data storage has struggled to close the gap. A CPU can execute an instruction every nanosecond, which is six orders of magnitude faster than a single magnetic-disk access. Much research has gone into hardware and software solutions for closing this time gap between CPUs and data storage; some of these advances include caches, pipelining, optimizing compilers, and RAM.

As computers evolve, so do the applications they are used for. Large binary files containing sound or image data have become commonplace, greatly increasing the need for high-capacity data storage and fast data access. A new high-capacity form of data storage must be developed to handle such large files quickly and efficiently.

Devices that use light to store and read data have been the backbone of data storage for nearly two decades. Compact discs (CDs) revolutionized data storage in the early 1980s, allowing many megabytes of data to be stored on a disc with a diameter of a mere 12 centimeters and a thickness of about 1.2 millimeters. In 1997, an improved version of the CD, called the Digital Versatile Disc (DVD), was released, which enabled the storage of full-length movies on a single disc.

CDs and DVDs are the primary data storage methods for music, software, personal computing, and video. A CD can hold 783 megabytes of data, equivalent to about one hour and 15 minutes of music, and Sony has plans to release a 1.3-gigabyte (GB) high-capacity CD. A double-sided, double-layer DVD can hold 15.9 GB of data, which is about eight hours of movies. These conventional storage media meet today's storage needs, but storage technologies must evolve to keep pace with increasing consumer demand. CDs, DVDs, and magnetic storage all record bits of information on the surface of the recording medium.


However, both magnetic and conventional optical data storage technologies, where individual bits are stored as distinct magnetic or optical changes on the surface of the recording medium, are approaching physical limits beyond which individual bits may be too small or too difficult to store. Storing information throughout the volume of a medium, not just on its surface, offers an intriguing high-capacity alternative.

In order to increase storage capabilities, scientists are now working on a new optical storage method, called holographic memory, that goes beneath the surface and uses the volume of the recording medium for storage instead of only its surface area. Three-dimensional data storage can store more information in a smaller space and offer faster data transfer times.

Holographic memory is a volumetric approach which, although conceived decades ago, has made recent progress toward practicality with the appearance of lower-cost enabling technologies, significant results from long-standing research efforts, and progress in holographic recording materials.

Holographic memory offers the possibility of storing 1 terabyte (TB) of data in a sugar-cube-sized crystal. A terabyte equals 1,000 gigabytes, 1 million megabytes, or 1 trillion bytes; data from more than 1,000 CDs could fit in a holographic memory system. A further advantage of holographic memory is that an entire page of data can be retrieved quickly and at one time.


CHAPTER 2


2. SUBJECT DETAILING

2.1 HARDWARE FOR HOLOGRAPHIC DATA STORAGE

The most important hardware components include the SLM used to imprint data on the object beam, two lenses for imaging the data onto a matched detector array, a storage material for recording volume holograms, and a reference beam intersecting the object beam in the material. Other components include the laser source, beam-forming optics for collimating the laser beam, beam splitters for dividing the laser beam into two parts, stages for aligning the SLM and the detector array, shutters for blocking the two beams when needed, and wave plates for controlling polarization. A beam-steering system directs the reference beam to the storage material.

Fig 1: A photograph of a 1 x 4094 beam-steering spatial light modulator and a magnified view of the grating structure of the SLM.

2.2 HOW DATA IS STORED

In holographic data storage, an entire page of information is stored at once as an optical interference pattern within a thick, photosensitive optical material. This is done by intersecting two coherent laser beams within the storage material. The first, called the object beam, contains the information to be stored; the second, called the reference beam, is designed to be simple to reproduce, for example a collimated beam with a planar wavefront.


The resulting optical interference pattern causes chemical and/or physical changes in the photosensitive medium: a replica of the interference pattern is stored as a change in the absorption, refractive index, or thickness of the medium. When the stored interference grating is illuminated with one of the two waves that were used during recording, some of the incident light is diffracted by the stored grating in such a way that the other wave is reconstructed. Illuminating the stored grating with the reference wave reconstructs the object wave, and vice versa.
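This recording-and-readout cycle can be written compactly in standard two-beam interference notation (a textbook formulation added here for clarity; O is the object wave, R the reference wave, and * the complex conjugate):

\[ I = |O + R|^2 = |O|^2 + |R|^2 + O R^{\ast} + O^{\ast} R . \]

If the induced change in the medium is proportional to I, the stored grating contains the cross term O R*. Re-illuminating with R alone therefore diffracts a field proportional to

\[ R \,(O R^{\ast}) = |R|^2\, O \;\propto\; O , \]

which is the reconstructed object wave; illuminating with O instead reconstructs R by the symmetric argument.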

Fig 2: In a holographic memory device, a laser beam is split in two, and the two resulting beams interact in a crystal medium to store a holographic recreation of a page of data.

A large number of these interference gratings or patterns can be superimposed in the same thick medium and accessed independently, as long as they are distinguishable by the direction or the spacing of the gratings. Any particular data page can then be read out independently by illuminating the stored gratings with the reference wave that was used to store that page.


The data to be stored are imprinted onto the object beam with a pixelated input device called a spatial light modulator (SLM); typically, this is a liquid crystal panel. The SLM is used to imprint binary information onto the laser light.

It is a 2D plane consisting of pixels which can be turned on and off to create binary 1s and 0s. A spatial light modulator contains a two-dimensional array of windows, each only a few microns wide. These windows block some parts of the incoming laser light and let other parts pass through, so the resulting cross-section of the laser beam is a two-dimensional array of binary data.
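As a concrete (hypothetical) illustration of the SLM's role, the short Python sketch below packs a byte string into the kind of square ON/OFF pixel array an SLM would display; the function name and page size are my own choices, not part of any real device driver.

    import numpy as np

    def bytes_to_page(data: bytes, page_side: int = 32) -> np.ndarray:
        """Pack a byte string into a page_side x page_side binary SLM page."""
        bits = np.unpackbits(np.frombuffer(data, dtype=np.uint8))
        page = np.zeros(page_side * page_side, dtype=np.uint8)
        n = min(bits.size, page.size)
        page[:n] = bits[:n]                        # zero-pad (or truncate) to fit the page
        return page.reshape(page_side, page_side)  # 1 = window open, 0 = window blocked

    page = bytes_to_page(b"HOLOGRAPHIC MEMORY")
    print(page.shape, int(page.sum()), "pixels ON")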

To retrieve data without error, the object beam must pass through a high-quality imaging system, one capable of directing this complex optical wavefront through the recording medium, where the wavefront is stored and later retrieved, and then onto a pixelated camera chip. The image of the data page at the camera must be as close to perfect as possible. Any optical aberrations in the imaging system or misfocus of the detector array would spread energy from one pixel to its neighbours. Optical distortions (where pixels on a square grid at the SLM are not imaged to a square grid at the detector) or errors in the magnification will move a pixel of the image off its intended receiver, and either of these problems (blur or shift) will introduce errors in the retrieved data. If, however, the reconstructed data page propagates backward through the same optics that were used during recording, most shortcomings of the imaging system are compensated.

2.3 FEATURES OF HOLOGRAPHIC MEMORY

In order for holographic technology to be applied to computer systems, it must store data in a form that a computer can recognize. In current computer systems, this form is binary: data is manipulated as bits.

In addition to high storage density, holographic data storage promises fast access times, because the laser beams can be steered without inertia, unlike the mechanical arm of a disk drive. With the inherent parallelism of its page-wise storage and retrieval, a very large compound data rate can be reached by combining many relatively slow, and therefore low-cost, parallel channels.


Another feature of holographic memory is associative retrieval: imprinting a partial or search data pattern on the object beam and illuminating the stored holograms reconstructs all of the reference beams that were used to store data.

The intensity that is diffracted by each of the stored gratings into the corresponding reconstructed reference beam is proportional to the similarity between the search pattern and the content of that particular data page. By determining which reference beam has the highest intensity and then reading the corresponding data page with this reference beam, the closest match to the search pattern can be found without initially knowing its address.
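A toy numerical analogue of this parallel search is sketched below (my own illustration in Python; a real system performs the correlation optically, against all stored pages at once). Each stored page is scored by its agreement with the known part of a query page, and the highest score identifies the address of the best match.

    import numpy as np

    rng = np.random.default_rng(0)
    stored = rng.integers(0, 2, size=(1000, 32, 32))      # 1000 stored binary pages

    def associative_search(pages, query, known):
        """Score every page by agreement with the query where known==True.
        Mimics diffracted reference-beam intensity scaling with similarity."""
        scores = (pages == query)[:, known].sum(axis=1)
        return int(np.argmax(scores))                     # address of the closest page

    query = stored[123]
    known = np.zeros((32, 32), dtype=bool)
    known[:8, :] = True                                   # only 8 rows of the page known
    print(associative_search(stored, query, known))       # prints 123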

Because of these advantages and capabilities, holographic storage has been an intriguing alternative to conventional data storage techniques for three decades. However, it is the recent availability of relatively low-cost components, such as liquid crystal displays for SLMs and solid-state camera chips from video camcorders for detector arrays, which has led to the current interest in creating practical holographic data storage devices.

Desirable performance specifications include data fidelity as quantified by the bit-error rate (BER), total system capacity, storage density, readout rate, and the lifetime of stored data.

The optical system with two lenses separated by the sum of their focal lengths is called the 4-f configuration, since the SLM and the detector array are four focal lengths apart. The 4-f system allows the high numerical apertures needed for high density. In addition, since the first lens takes a spatial Fourier transform of the SLM data, the hologram stores the Fourier transform of the data, which is then Fourier-transformed again upon readout by the second lens. This has several advantages: point defects on the storage material do not lead to lost bits, but only to a slight loss in signal-to-noise ratio at all pixels; and the storage material can be removed and replaced in an offset position, yet the data can still be reconstructed correctly. In addition, the Fourier transform properties of the 4-f system lead to the parallel optical search capabilities offered by holographic associative retrieval.
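In standard Fourier-optics notation (textbook relations, included to make these claims concrete; f is the focal length and λ the wavelength), the field at the shared focal plane is the scaled Fourier transform of the SLM field, and two cascaded transforms return the page inverted:

\[ E_{\mathrm{holo}}(u) \;\propto\; \int E_{\mathrm{SLM}}(x)\, e^{-i 2\pi u x/(\lambda f)}\, dx , \qquad \mathcal{F}\{\mathcal{F}\{E_{\mathrm{SLM}}\}\}(x) = E_{\mathrm{SLM}}(-x) . \]

Because every SLM pixel is spread over the whole Fourier plane, a point defect there removes a little energy from every pixel rather than obliterating any single pixel, which is why only the signal-to-noise ratio suffers.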


The disadvantages of the Fourier transform geometry come from the uneven distribution of intensity in the shared focal plane of the two lenses.

2.4 CODING AND SIGNAL PROCESSING

In a data storage system, the goal of coding and signal processing is to reduce the BER to a sufficiently low level while achieving important figures of merit such as high data rate and high density. This is accomplished by stressing the physical components of the system well beyond the point at which the channel is error-free, and then introducing coding and signal processing schemes to reduce the BER to levels acceptable to users. Although the system retrieves raw data from the storage device with many errors (a high raw BER), coding and signal processing ensure that the user data are delivered with an acceptably low level of error (a low user BER).

Coding and signal processing can involve several qualitatively distinct elements. The cycle of user data from input to output can include interleaving, error-correction coding (ECC) and modulation encoding, signal preprocessing, data storage in the holographic system, hologram retrieval, signal postprocessing, binary detection, and decoding of the interleaved ECC.

The ECC encoder adds redundancy to the data in order to provide protection from various noise sources. The ECC-encoded data are then passed on to a modulation encoder, which adapts the data to the channel: it manipulates the data into a form less likely to be corrupted by channel errors and more easily detected at the channel output. The modulated data are then input to the SLM and stored in the recording medium. On the retrieval side, the CCD returns pseudo-analog values (typically camera counts of eight bits), which must be transformed back into digital data (typically one bit per pixel). The first step in this process is a postprocessing step, called equalization, which attempts to undo distortions created in the recording process, still in the pseudo-analog domain. Then the array of pseudo-analog values is converted to an array of binary digital data via a detection scheme. The array of digital data is passed first to a modulation decoder, which performs the reverse operation of the modulation encoding, and then to the ECC decoder.
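The chain just described can be sketched end to end with toy stand-ins for each stage (a parity-per-block ECC and a two-pixels-per-bit balanced modulation code, chosen only for brevity; real systems use Reed-Solomon codes and stronger modulation constraints):

    import numpy as np

    rng = np.random.default_rng(1)

    def ecc_encode(bits):                  # toy ECC: append one even-parity bit per 8 bits
        blocks = bits.reshape(-1, 8)
        parity = blocks.sum(axis=1) % 2
        return np.hstack([blocks, parity[:, None]]).ravel()

    def modulate(bits):                    # toy balanced code: bit b -> pixel pair (b, 1-b)
        return np.stack([bits, 1 - bits], axis=1).ravel()

    def detect(analog):                    # balanced detection: brighter pixel of each pair
        pairs = analog.reshape(-1, 2)
        return (pairs[:, 0] > pairs[:, 1]).astype(np.uint8)

    user = rng.integers(0, 2, 64).astype(np.uint8)         # user data
    pixels = modulate(ecc_encode(user)).astype(float)      # page sent to the SLM
    readout = pixels + rng.normal(0, 0.3, pixels.size)     # optics + detector noise
    decoded = detect(readout).reshape(-1, 9)
    bad = (decoded[:, :8].sum(axis=1) % 2) != decoded[:, 8]
    print("blocks flagged by parity:", int(bad.sum()))

The parity stage here only flags errors; a Reed-Solomon decoder in the same position would correct them.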


2.5 SOURCES OF NOISE AND DISTORTION

This section describes several sources of noise and distortion and indicates how the various coding and signal processing elements help in dealing with them.

a. Binary detection

The simplest detection scheme is threshold detection, in which a threshold T is chosen: any CCD pixel with intensity above T is declared a 1, while those below T are assigned to class 0. Choosing a reliable threshold is difficult, however, because brightness varies across the page. Within a sufficiently small region of the detector array there is not much variation in pixel intensity, so if the page is divided into several such small regions, and within each region the data patterns are balanced (i.e., have the same number of 1s and 0s), detection can be accomplished by sorting the pixel intensities, without using a threshold.

One problem with this method is that the array detected by sorting may not be a valid codeword of the modulation code; in this case, one must have a procedure which transforms balanced arrays into valid codewords. A more complex, but rather more accurate, scheme than sorting is correlation detection, in which the detector chooses the codeword that achieves maximum correlation with the array of received pixel intensities.
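The two detection ideas can be contrasted in a few lines (hypothetical numbers; the 0.4 offset plays the role of an unknown page-brightness variation that defeats a fixed threshold but not sorting):

    import numpy as np

    rng = np.random.default_rng(2)
    block = np.array([1, 0, 1, 0, 1, 0, 1, 0])              # balanced block: four 1s
    analog = block + rng.normal(0, 0.2, 8) + 0.4            # unknown brightness offset

    thresholded = (analog > 0.5).astype(int)                # fixed threshold: offset hurts

    detected = np.zeros(8, dtype=int)                       # sorting-based detection:
    detected[np.argsort(analog)[4:]] = 1                    # brightest half declared 1

    print("threshold:", thresholded)
    print("sorting:  ", detected)                           # recovers the block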

b. Interpixel interference

Interpixel interference is a common phenomenon in which the intensity at one particular pixel contaminates the data at nearby pixels. Physically, this arises from optical diffraction or from aberrations in the imaging system. The extent of interpixel interference can be quantified by the point-spread function, sometimes called the PSF filter. If the channel is linear and the PSF is known, the interpixel interference can be represented as a convolution of the PSF with the original (encoded) data pattern, and then undone in the equalization step via a filter inverse to the PSF.
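Under the linear-channel assumption this is a one-line model: blur is multiplication by the PSF's transfer function in the Fourier domain, and equalization is division by it. A minimal sketch (noise-free, with a made-up five-tap PSF whose transfer function has no zeros):

    import numpy as np

    page = np.random.default_rng(3).integers(0, 2, (64, 64)).astype(float)

    psf = np.zeros((64, 64))                       # central pixel plus 4 weak neighbours
    psf[0, 0] = 0.8
    psf[0, 1] = psf[1, 0] = psf[0, -1] = psf[-1, 0] = 0.05

    H = np.fft.fft2(psf)                           # channel transfer function
    blurred = np.fft.ifft2(np.fft.fft2(page) * H).real        # interpixel interference
    equalized = np.fft.ifft2(np.fft.fft2(blurred) / H).real   # inverse-PSF equalizer

    print("max residual:", float(np.abs(equalized - page).max()))   # ~1e-15 here

With detector noise added, the division by small values of H amplifies that noise, which is the enhancement problem noted below.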


Deconvolution has the advantage that it incurs no capacity overhead (a code rate of 100%). However, it suffers from mismatch in the channel model, from inaccuracies in the estimation of the PSF, and from enhancement of random noise. An alternative method of combating interpixel interference is to forbid certain patterns of high spatial frequency via a modulation code.

A code that forbids patterns of high spatial frequency is called a low-pass code. Such codes constrain the allowed pages to have limited high-spatial-frequency content. A general scheme for designing such codes is the strip-encoding method, in which each data page is encoded, from top to bottom, in narrow horizontal pixel strips, with the constraint satisfied both within each strip and between neighbouring strips.

c. Error correction

In contrast to modulation codes, which introduce a distributed redundancy to enable binary detection of the pseudo-analog intensities, error correction incorporates explicit redundancy in order to identify decoded bit errors. The ECC decoder receives a sequence of detected data with an unacceptably high raw BER and uses the redundant bits to correct errors in the user bits, reducing the output user BER to a tolerable level (typically less than 10^-12). The simplest and best-known error-correction scheme is parity checking, in which bit errors are identified because they change the parity of the number of ones in a given block. Most of the work on ECC for holographic storage has focused on the more powerful Reed-Solomon (RS) codes. These codes have been used successfully in a wide variety of applications for two reasons: 1) they have very strong error-correction power relative to the required redundancy, and 2) their algebraic structure facilitates the design and implementation of fast, low-complexity decoding algorithms.

In a straightforward implementation of an ECC such as an RS code, each byte would be written into a small array, and the bytes of a codeword would simply be rastered across the page.


There might be 250 bytes per codeword. If the errors were independent from pixel to pixel and identically distributed across the page, this would work well. In practice, however, errors often arrive in spatial clusters, which could corrupt many bytes of a single rastered codeword.

Assume that the ECC can correct two byte errors per codeword. If the codewords are interleaved so that any cluster error can contaminate at most two bytes in each codeword, the cluster error will not defeat the error-correcting power of the code. Interleaving schemes of this kind have been studied extensively for one-dimensional applications.

For certain sources of error, it is reasonable to assume that the raw-BER distribution across the page is fixed from hologram to hologram, so it can be accurately estimated from test patterns. Using this information, codewords can be interleaved in such a way that not too many pixels with high raw BER lie in the same codeword. This technique, known as matched interleaving, can significantly improve the user BER.
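A simple two-dimensional interleaver (illustrative, not the matched-interleaving algorithm itself) shows the principle: scatter the bytes of each codeword across the page so that a compact cluster error touches at most one byte of any codeword.

    import numpy as np

    rows = cols = 16                                  # a 16 x 16 page of bytes
    r, c = np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij")
    layout = (4 * r + c) % 16                         # byte (r, c) -> codeword 0..15

    cluster = layout[5:8, 7:10].ravel()               # a 3 x 3 cluster error on the page
    per_codeword = np.bincount(cluster, minlength=16)
    print("worst codeword loses", int(per_codeword.max()), "byte(s)")   # -> 1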

d. Predistortion

This method was developed at IBM Almaden. It works by individually manipulating the recording exposure of each pixel on the SLM, either through control of the exposure time or through the relative pixel transmission. Deterministic variations among the ON pixels, such as those created by fixed-pattern noise, nonuniformity in the illuminating object beam, and even interpixel crosstalk, can be suppressed. Many of the spatial variations to be removed are present in an image transmitted at low power from the SLM directly to the detector array. Once this particular pattern of nonuniform brightness levels is obtained, the recording exposure for each pixel is simply calculated from the ratio between its current brightness value and the desired pixel brightness.
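In sketch form (assuming, as the text does, a response linear in exposure; the variable names are mine):

    import numpy as np

    rng = np.random.default_rng(4)
    desired = 1.0                                          # target ON-pixel brightness
    measured = desired * rng.uniform(0.6, 1.4, (32, 32))   # low-power calibration image

    exposure = desired / measured                 # per-pixel predistorted exposure
    recorded = measured * exposure                # linear medium: brightness evens out
    print("brightness spread after predistortion:", float(recorded.std()))   # ~0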

At low density, raw-BER improvements of more than 15 orders of magnitude are possible. More significantly, at high density, interpixel crosstalk can be suppressed and the raw BER improved from 10^-4 to 10^-12.


e. Gray scale

The previous sections have shown that the coding introduced to maintain an acceptable BER comes with an unavoidable overhead cost, resulting in somewhat less than one bit per pixel. The predistortion technique described in the previous section makes it possible to record data pages containing gray scale. Since more than two brightness levels per pixel can be recorded and detected, it is possible to store more than one bit of data per pixel. To encode and decode these gray-scale data pages, several local-threshold methods and balanced modulation codes were also developed.
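The payoff is easy to quantify: with L distinguishable brightness levels per pixel, each pixel can carry up to

\[ b = \log_2 L \ \text{bits per pixel}, \qquad L = 3 \;\Rightarrow\; b = \log_2 3 \approx 1.58 . \]

In practice the gain is smaller, since the gray levels sit closer together and demand more coding overhead, which is consistent with the 30% figure quoted in the next subsection.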

f. Capacity estimation

To quantify the overall storage capacity of different gray-scale encoding options, an experimental capacity-estimation technique was developed. In this technique, the dependence of the raw BER on readout power is first measured experimentally; the method then produces the relationship between M, the number of holograms that can be stored, and the raw BER.

In general, as the raw BER of the system increases, the number of holograms M increases slowly. In order to maintain a low user BER as this raw-BER operating point increases, the redundancy of the ECC must also increase. Thus, while the number of holograms increases, the number of user bits per hologram decreases, creating an optimal raw BER at which the user capacity is maximized. For the commonly used Reed-Solomon ECC codes, this optimal raw BER is approximately 10^-3. The use of three gray levels offered a 30% increase in both capacity and readout rate over conventional binary data pages.
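The trade-off can be made concrete with a toy model (both curves below are invented for illustration; only their qualitative shapes, M rising slowly and the code rate falling as the raw BER grows, reflect the text):

    import numpy as np

    raw_ber = np.logspace(-5, -1, 400)
    M = 1000 * (1 + 0.15 * np.log10(raw_ber / 1e-5))   # holograms stored: grows slowly
    rate = 1 - 3 * np.sqrt(raw_ber)                    # ECC code rate: shrinks with raw BER
    capacity = M * rate                                # user capacity (arbitrary units)

    best = raw_ber[np.argmax(capacity)]
    print(f"toy capacity peaks near raw BER = {best:.0e}")   # on the order of 1e-3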


2.6 RECORDING MATERIALS

The properties of holographic recording materials can be broadly characterized as optical quality, recording properties, and stability. These directly affect the data density and capacity that can be achieved, the data rates for input and output, and the BER.

For the highest density at low BER, the imaging of the input data from the SLM to the detector must be nearly perfect, so that each data pixel is read cleanly by the detector. The recording medium itself is part of the imaging system and must exhibit the same high degree of perfection. Furthermore, if the medium is moved to access different areas with the readout beam, this motion must not compromise the imaging performance. Thus, very high standards of optical homogeneity and fabrication must be maintained over the full area of the storage medium. With sufficient materials-development effort and care in fabrication, the necessary optical quality has been achieved both for inorganic photorefractive crystals and for organic photopolymer media.

A more microscopic aspect of optical quality is the intrinsic light scattering of the material. The detector noise floor produced by scattering of the readout beam imposes a fundamental minimum on the diffraction efficiency of the stored data hologram, and thus on the storage density and the readout data rate.

Because holography is a volume storage method, the capacity of a holographic storage system tends to increase as the thickness of the medium increases: greater thickness implies the ability to store more independent diffraction gratings with higher selectivity, so that individual data pages can be read out without crosstalk from other pages stored in the same volume. For the storage densities necessary to make holography a competitive storage technology, a medium thickness of at least a few millimeters is highly desirable.

Holographic recording properties are characterized in terms of sensitivity and dynamic range. Sensitivity refers to the extent of refractive-index modulation produced per unit exposure (energy per unit area).


Diffraction efficiency is proportional to the square of the index modulation times the thickness; thus, recording sensitivity is commonly expressed in terms of the square root of the diffraction efficiency, and its unit is cm/J.
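In symbols (standard definitions from the holographic-storage literature, restated here; η is the diffraction efficiency, Δn the index modulation, ℓ the thickness, I the recording intensity, and t the exposure time):

\[ \eta \;\propto\; (\Delta n \,\ell)^2 , \qquad S = \frac{\sqrt{\eta}}{I\, t\, \ell} \quad [\mathrm{cm/J}] . \]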

The term dynamic range refers to the total response of the medium when it is divided up among many holograms multiplexed in a common volume of material. Dynamic range has a strong impact on the data storage density that can be achieved.

Stability is a desirable property for any data storage system. In the case of holographic storage, the response of the recording medium, which converts the optical interference pattern into a refractive-index pattern (the hologram), is generally linear in light intensity and lacks the response threshold found in bistable storage media such as magnetic films. Much basic research in holographic storage has been performed using photorefractive crystals as storage media; among these crystals, Fe-doped lithium niobate has been used most widely. Its sensitivity is sufficient for demonstration purposes, but falls short by a factor of 100 for practical applications.

Stability in the dark over long periods is also an issue. Organic photopolymer materials are often subject to aging processes caused by residual reactive species left in the material after recording, or by stresses built up in the material during recording; erasure may occur through residual thermal diffusion of the molecules which record the hologram. In photorefractive crystals, stability in the dark depends on the trapping of charge carriers at trap energies that are not thermally accessible at room temperature.

Many kinds of materials have been investigated as holographic storage media. Five materials are compared in Table 1 on the basis of optical quality, scattered-light level, holographic fidelity, sensitivity, dynamic range, stability, and available thickness. Photopolymers are very promising because of their high sensitivity and dynamic range. Phenanthrenequinone-doped polymethylmethacrylate (PQ/PMMA) has excellent optical quality and is based on a photoreaction between the dopant and the polymer, followed by diffusion of unreacted chromophore.


Table 1: Comparison of properties of prospective materials for holographic data storage. The materials compared are LiNbO3:Fe, two-color LiNbO3, Polaroid photopolymer, PQ/PMMA, and the Bayer photoaddressable polymer, each rated on image quality, scatter, holographic fidelity, sensitivity, dynamic range, stability, and available thickness.

2.7 SUMMARY OF POLYMER WORK

Polymer materials are important candidates for holographic storage media. They promise to be inexpensive to manufacture while offering a wide variety of possible recording mechanisms and materials systems. The opportunity for fruitful development of polymer holographic media is thus very broad, and a variety of approaches to using organic materials for holography have been pursued, including organic photorefractive materials, triplet-sensitized photochromic systems, photoaddressable polymers, and materials which produce index modulation via material diffusion. Of these, PQ/PMMA is a polymer glass in which a photoreaction binds the phenanthrenequinone chromophore to the PMMA. This material has the excellent optical quality of the PMMA matrix, it is available in reasonable thickness, and its sensitivity is reasonably good. However, the current need for lengthy thermal treatment makes it unacceptable for most storage applications.

The diffusion-driven photopolymer systems offer very high sensitivity and need no such post-exposure processing.


The magnitude of this refractive-index modulation can be very high, resulting in a high dynamic range. Another class of organic materials undergoing rapid development is the photoaddressable polymer systems. Recording media of this type have high dynamic range, and thus the potential for high data storage density, and may be reversible, enabling rewritable storage.


CHAPTER 3


3. HOLOGRAPHIC MEMORY vs. EXISTING MEMORY TECHNOLOGY

In the memory hierarchy, holographic memory lies somewhere between RAM and magnetic storage in terms of data transfer rate, storage capacity, and access time. For green light, the maximum theoretical storage capacity is 0.4 Gbits/cm^2 for a page size of 1 cm x 1 cm. Holographic memory has an access time near 2.4 µs, a recording rate of 31 KB/s, and a readout rate of 10 GB/s. Modern magnetic disks have transfer rates of 5 to 20 MB/s, while typical DRAM today has an access time close to 10 to 40 ns and a recording rate of 10 GB/s.
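The quoted areal figure is consistent with a simple back-of-envelope diffraction estimate (my own sanity check, assuming one bit per pixel with a pitch of roughly one wavelength, about 0.5 µm for green light):

\[ N = \left( \frac{1\ \mathrm{cm}}{0.5\ \mu\mathrm{m}} \right)^{2} = (2 \times 10^{4})^{2} = 4 \times 10^{8} \ \mathrm{bits} = 0.4\ \mathrm{Gbit}. \]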

Storage medium        Access time    Data transfer rate    Storage capacity
Holographic memory    2.4 µs         10 GB/s               400 Mbits/cm^2
Main memory (RAM)     10-40 ns       5 MB/s                4.0 Mbits/cm^2
Magnetic disk         8.3 ms         5-20 MB/s             100 Mbits/cm^2

Table 2: Comparison of access time, data transfer rate, and storage capacity for three types of memory: holographic, RAM, and magnetic disk.

Holographic memory has an access time somewhere between main memory and magnetic disk, a data transfer rate that is an order of magnitude better than both, and a storage capacity higher than both. If the issues of hologram decay and interference are resolved, holographic memory could become a part of the memory hierarchy, or even take the place of magnetic disks, much as magnetic disks displaced magnetic tape for most applications.


CHAPTER 4


4.1 POSSIBLE APPLICATIONS

There are many possible applications of holographic memory. Holographic memory systems can potentially provide the high-speed transfers and large storage volumes that future computer systems will require. Two important applications are data mining and petaflop computing.

Data mining is the process of finding patterns in large volumes of data. It is used heavily on large databases which hold possible patterns that cannot be distinguished by the human eye because of the sheer amount of data. Some current computer systems implement data mining, but the mass of storage required is pushing the limits of current data-storage systems. The advances in access time and storage capacity that holographic memory provides could exceed conventional storage and speed up data mining considerably, resulting in more patterns found in a shorter amount of time.

Another possible application is petaflop computing. A petaflop is a thousand trillion floating-point operations per second. The fast access to extremely large amounts of data provided by holographic memory systems could be exploited in a petaflop architecture; optical storage such as holographic memory offers a viable way to hold the extreme amounts of data a petaflop machine would consume.


4.2 FUTURE OF HOLOGRAPHIC MEMORY

Optoware Corp of Japan is developing a holographic disk with a 200-GB storage capacity and a record/play drive with a data transfer rate of 100 Mbits/s; the drive was expected to ship in 2002. Holographic recording technology was developed before current optical disks, but it was never commercialized; a host of problems prevented it from making the jump from lab to store. The system required mounting on a massive anti-vibration stand, the optics were difficult to miniaturize, and the cubical recording media had to be fabricated with extreme flatness and side-to-side parallelism. And, of course, it lacked compatibility with existing digital versatile discs (DVDs).

Polarized light control

Optoware has developed three technologies: (1) a single-beam record/play mechanism that makes it possible to miniaturize the optics; (2) servo technology to absorb drive vibration and distortions in the media surface; and (3) support for media removability and random access, through the development of a disk medium with embedded address and other information.

The single-beam mechanism (1) has been named polarized collinear holography, and supports holographic recording with only a single optical-disk objective lens. The most common approach has been to illuminate the recording medium with two different laser beams, one for the signal and one for the reference, each with a different angle of incidence, the interference pattern between them being what is recorded.

This approach meant that two laser beams from a light source had to be split, with each beam traveling through its own optical path, objective lens, and mirror, making miniaturization difficult.

The technique developed by Optoware uses a polarizing beam splitter (PBS) to mix the signal and reference beams and to pass them both through the same objective lens. Interference between the two occurs in the coaxial state.


The key component for separating the signal and reference beams during playback is a polarization-control device called a composite optical rotator, which is placed immediately before the objective lens. Two optical rotators (one in each half) are joined at the center so that, for example, the left half rotates the plane of polarization 45 degrees counterclockwise and the right half 45 degrees clockwise. Interference occurs between beams when their polarizations are offset by 0 or 180 degrees, and does not occur when they are offset by 90 degrees.

The reference (P-polarized) beam and the signal (S-polarized) beam are offset from each other by 90 degrees and mixed by the PBS, then passed through the composite optical rotator. Interference occurs between the left half of the signal beam and the right half of the reference beam, and vice versa, creating the interference pattern in the recording layer. For playback, only the reference beam is input; the PBS then strips it from the reflected light to leave only the signal beam, which is captured by the imaging device.
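The polarization rule invoked here is the standard two-beam interference relation (textbook form; ê_r and ê_s are the unit polarization vectors of the reference and signal fields, Δφ their phase difference):

\[ I = |E_r|^2 + |E_s|^2 + 2\,(\hat{e}_r \cdot \hat{e}_s)\,|E_r||E_s| \cos\Delta\phi , \]

so the fringe term vanishes for orthogonal polarizations (offset 90 degrees) and has full contrast when the polarizations are parallel or antiparallel (offset 0 or 180 degrees), exactly the behaviour the composite rotator exploits.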

Simpler Servos

As a result of the simpler optical system, a servo mechanism similar to those used in current optical disk drives can be adopted. This, in turn, contributes to lower media prices. A focusing servo, for example, makes it possible to adjust focus to match uneven thickness of the disk medium, while a tracking servo eliminates the need for vibration damping.

Two laser beams, one green and one red, are used: the green laser for holographic recording and the red for servo control signals, as well as for playing DVDs. At present the optical power of the prototype during recording is high, at several hundred mW at the recording layer, but the firm claims the major obstacles have been cleared for slashing this down to several mW.


CHAPTER 5


5. CONCLUSION

The page access of data that holographic memory provides will open a window into next-generation computing by adding another dimension to stored data. Finding holograms in personal computers, however, may be further off: the large cost of high-tech optical equipment would make small-scale holographic memory systems impractical, so holographic memory will more likely appear first in next-generation supercomputers, where cost is not as much of an issue. Current magnetic storage devices remain far more cost-effective than any other media on the market, and as computer systems evolve, it is not unreasonable to believe that magnetic storage will continue to improve along with them. The parallel nature of holographic memory offers many potential gains over serial storage methods, but many advances in optical technology and photosensitive materials must still be made before we find holograms in computer systems. Even so, holographic memory is close to becoming a reality: the basic theory behind it has been shown to be reliable and has been implemented in numerous experiments.

For holographic memory to truly become the next revolution in data storage, data transfer rates must be improved, hologram decay must become negligible, and hologram recording time must be reduced. Then it will be economical for holographic memory to be produced for mass consumption.


REFERENCES:
1. www.howstuffworks.com
2. www.photonics.com
