READ ahead before trying
this lab, especially the first three sections. IMPORTANT:
Answer the pre-questions before you attempt the lab.
This lab isn't so much about physics as it
is about analysing complex data sets. The data set we analyse
comes from the lab Mathematica Simulation of a particle
in a 3-D Box with Infinite Potential Walls.
You'll need the notebook from that lab, which you
probably saved as later.ma. You'll also
need to copy the file binfiles.ma to your disk or the mathematica
directory.
BACKGROUND
We considered a 3-D box of side L. Inside
the box the potential is zero; outside the box the potential
is infinite. To visualize the data, I had you use the
Scatterplot module to make a 3-D plot of the probability
of finding a particle in the box for a certain state.
If you think about it, this is actually four-dimensional data:
probability (one dimension) versus x, y, and z (three additional
dimensions). Perhaps you should look at those plots again.
The plots we made presented the probability of finding the particle
at location x, y, z, in the box. The density of points in the
space represented the probability. This type of visualization
is called a density plot or scatter plot.
A high density of points indicated a high probability. Of course,
there are many other visualization techniques you can use to look
at four- (and even higher-) dimensional data.
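The density-plot idea can be made concrete with a short sketch. The lab itself uses Mathematica's Scatterplot module; the version below is a Python/NumPy illustration (the ground-state probability function is my assumption) that rejection-samples positions so the density of accepted points follows the probability.

```python
import numpy as np

def sample_density(prob, n_points, L=1.0, rng=None):
    """Rejection-sample 3-D points whose density follows prob(x, y, z)."""
    if rng is None:
        rng = np.random.default_rng(0)
    # Estimate an upper bound for prob on a coarse grid (fine for a sketch).
    g = np.linspace(0.0, L, 20)
    X, Y, Z = np.meshgrid(g, g, g, indexing="ij")
    pmax = prob(X, Y, Z).max()
    pts = []
    while len(pts) < n_points:
        x, y, z = rng.uniform(0.0, L, 3)
        if rng.uniform(0.0, pmax) < prob(x, y, z):
            pts.append((x, y, z))
    return np.array(pts)

# Ground state of a particle in a box of side 1: density peaks at the centre.
prob = lambda x, y, z: (np.sin(np.pi * x) * np.sin(np.pi * y) * np.sin(np.pi * z)) ** 2
pts = sample_density(prob, 500)
```

Plotted as a 3-D scatter plot, pts reproduces the density plots from the earlier lab: points crowd where the probability is high.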
One common way to store large four-dimensional
data is as a three-dimensional matrix whose elements
represent the values in the fourth dimension. (The fourth
dimension must be the only dependent variable.) If you sample
the volume of interest at, say, 256 points in each direction, the
matrix representing that data would be a 256 x 256 x 256 matrix.
Then sophisticated software applications can be used to slice
and dice the sampled volume. Often, with the use of colour
look-up tables, the software can assign colours to the dependent
dimension.
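As a sketch of this storage scheme (Python/NumPy here for illustration; the field and grid size are my assumptions), the dependent fourth dimension simply fills a 3-D array:

```python
import numpy as np

# Sample a scalar field at 64 points in each direction; the resulting
# 64 x 64 x 64 array stores the dependent (fourth) dimension at every (x, y, z).
n, L = 64, 10.0
x = np.linspace(0.0, L, n)
X, Y, Z = np.meshgrid(x, x, x, indexing="ij")
field = (np.sin(np.pi * X / L) * np.sin(np.pi * Y / L) * np.sin(np.pi * Z / L)) ** 2

print(field.shape)   # (64, 64, 64)
print(field.nbytes)  # 2097152 bytes at 8 bytes per value
```

A full 256 x 256 x 256 volume of 8-byte values would take 128 MB, which is why the lab rescales values to single bytes before saving.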
Slices in a data set are just that. Imagine taking
a cube of butter and injecting it with red dye. How does the
dye diffuse in the butter? One way to examine the diffusion pattern
is to take slices off one side and, slice by slice, note
the pattern the dye makes as it is uncovered. This is aptly called
slicing. In a set of real data, you can look at several
slices at once and, by assigning the transparency of each slice,
sample the data at several locations throughout the data set.
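In code, slicing is just indexing one axis of the data volume. A Python/NumPy sketch (the toy volume is my stand-in for real data):

```python
import numpy as np

# Peel 2-D planes off a 3-D volume, one at a time, like shaving the butter.
vol = np.arange(64).reshape(4, 4, 4)   # a tiny stand-in data volume

slices = [vol[ix, :, :] for ix in range(vol.shape[0])]  # y-z planes along x
# Displayed in succession, these planes reveal the interior structure.
```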
Dicing encompasses several
techniques.
One is looking at surfaces of constant value. These
are called isosurfaces. For the physicist, perhaps the
most familiar examples are the equipotential surfaces we learn
about in electromagnetism. For instance, the isosurfaces of voltage
(or equipotential surfaces) about a point charge are spheres.
Returning to the dye-in-the-butter example, suppose you could accurately
determine by eye where the concentration of dye is 0.1 grams per cm3.
You take a scalpel and carefully carve away the
butter along the surface that has that concentration. You are
left with, besides a large mess, the isosurface of 0.1 grams of
dye per cm3 that diffused in the butter. In terms of
real data, you can 'draw' several isosurfaces representing different
values of the data and assign the transparency of each. This
way one can see into the center of the data.
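A crude numerical version of an isosurface (a Python/NumPy sketch; production tools instead fit a polygonal surface with algorithms such as marching cubes) just masks the voxels lying near the chosen value. For a point-charge-like potential, the masked voxels form the expected sphere:

```python
import numpy as np

n = 50
x = np.linspace(-1.0, 1.0, n)
X, Y, Z = np.meshgrid(x, x, x, indexing="ij")
r = np.sqrt(X**2 + Y**2 + Z**2)
field = 1.0 / np.maximum(r, 1e-6)   # potential of a point charge at the origin

iso = 2.0                            # the equipotential V = 2 is the sphere r = 0.5
shell = np.abs(field - iso) < 0.05   # voxels within a tolerance of the iso-value
```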
Another technique, commonly called alpha-channel
dicing, makes use of the colour channels (the red-channel,
green-channel, and blue-channel) and the transparency
factor or alpha-channel. In this method, one edits the
colour tables so they emphasize only the values of interest. For instance,
in computing the probability of finding a particle in a 3-D quantum
well, one is not especially interested in probabilities between
about 0.0 and 0.4. One edits the colour tables such that, for
instance, red-channel peaks at 1.0 and falls off to zero at 0.7.
The green-channel starts at 0.8, peaks at 0.6, and returns to
zero value at 0.5. The blue channel begins at 0.7, rises to
peak at 0.5 and falls to zero again at 0.4. The alpha-channel
is edited so high probability has a high alpha-value so it is
opaque. The alpha-channel may fall linearly to 0.5 so that blue
and green (low probability) are transparent. This is an oversimplified
example. Editing the colour tables is not difficult, but usually
requires much trial and error to create an easy-to-interpret visualization
of the data.
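The colour-table editing described above amounts to building a piecewise-linear RGBA lookup table. Here is a Python/NumPy sketch using the breakpoints from the example (the exact alpha ramp is one reading of the text, my assumption):

```python
import numpy as np

p = np.linspace(0.0, 1.0, 256)   # probability axis of the lookup table

red   = np.interp(p, [0.7, 1.0], [0.0, 1.0])            # zero below 0.7, peaks at 1.0
green = np.interp(p, [0.5, 0.6, 0.8], [0.0, 1.0, 0.0])  # peaks at 0.6
blue  = np.interp(p, [0.4, 0.5, 0.7], [0.0, 1.0, 0.0])  # peaks at 0.5
alpha = np.interp(p, [0.4, 1.0], [0.5, 1.0])            # low probability stays translucent

lut = np.stack([red, green, blue, alpha], axis=1)       # a 256 x 4 RGBA table
```

Feeding a lookup table like this to a volume renderer makes high-probability regions opaque and red while low-probability regions fade toward transparent blue and green.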
Projections of the data
onto a surface (flat or otherwise) are also possible. Projections
don't really fall into either the slice or dice category. The
way to think of a projection is to consider nested glass bowls
held between an overhead projector and a screen. One sees the
projection of the surfaces of the bowls on the screen. By editing
the alpha channel and colour tables (see the alpha-channel
dicing example), one can create surfaces (isosurfaces or a
continuum of surfaces) with different 'optical' properties. A
simulated white light source is applied to the data volume and
projected onto a screen. The white light projection can be programmed
such that the light picks up the dominant colour it encounters
along each path taken in the volume.
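A very simple stand-in for the white-light projection (a Python/NumPy sketch, my simplification) is a maximum-intensity projection: collapse the volume onto a screen by keeping the brightest value along each line of sight.

```python
import numpy as np

n = 32
x = np.linspace(0.0, 1.0, n)
X, Y, Z = np.meshgrid(x, x, x, indexing="ij")
vol = (np.sin(np.pi * X) * np.sin(np.pi * Y) * np.sin(np.pi * Z)) ** 2

screen = vol.max(axis=0)   # project along x onto the y-z 'screen'
```

Real volume renderers instead accumulate colour and opacity along each ray, which is how the projection 'picks up' the dominant colour it encounters.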
Of course, moving animations can be made from slices
or dices. For instance, one could create slices in the y-z plane
in intervals along the x-axis. Then linking the slices in an
animation, one may see the slices in succession. Moreover, if
the interval between slices is small, the animation would show
the data in the y-z plane as successive shavings are taken off
along the x-axis. Another example is to create isosurfaces and
then render the volume at successively different viewing angles.
These different views can be linked together to form a rotating
isosurface, a flyby of the surface, or even a journey around and
through the surface. The ways that the techniques can be combined
into an animation are only limited by the imagination and the
available computer time.
LEARNING ABOUT CREATING A DATA FILE
I have provided you with a module called binfiles.ma.
This handy little module provides the tools to take single streams
of bytes and concatenate them into a binary file. The idea is
that you calculate a portion of a larger computation, save the results
in the file, and then compute more chunks. The module contains
the tools needed to create large binary files. The function
IncrementalScaleTool[1,255,globalmin,globalmax,data]
takes a list in a variable data which has
values in the range between globalmin and globalmax
and rescales it to the range 1 to 255. The result is to create
a list of 8-bit bytes. This way, when the list is saved, it will
be a standard 8-bit binary file. Since the final computation
will contain a great number of values, it's important to keep the
files small. A binary file is about as compressed as you can
get (without fancy compression software). The other tool is the
function
IncrementallyBuildBinaryFile[filename, data, chunknumber]
which takes the variable data, which is expected
to have the 8-bit binary range of 1 to 255, and writes it to a
file called filename. The chunknumber must
be set to 1 the first time that the tool is called so that the
function first creates the file. The function returns the value
of chunknumber + 1 so that it's easy to call this function
repeatedly.
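To see the behaviour of the two tools outside Mathematica, here is a hedged Python sketch of equivalents. The names mirror the module's functions, but this is an illustration of the idea, not the binfiles.ma code itself:

```python
import os
import tempfile
import numpy as np

def incremental_scale(lo, hi, gmin, gmax, data):
    """Rescale values from [gmin, gmax] to integer bytes in [lo, hi]."""
    data = np.asarray(data, dtype=float)
    scaled = lo + (data - gmin) * (hi - lo) / (gmax - gmin)
    return np.clip(np.rint(scaled), lo, hi).astype(np.uint8)

def incrementally_build_binary_file(filename, data, chunknumber):
    """Append a chunk of bytes; chunknumber == 1 creates the file first."""
    mode = "wb" if chunknumber == 1 else "ab"
    with open(filename, mode) as f:
        f.write(np.asarray(data, dtype=np.uint8).tobytes())
    return chunknumber + 1  # ready to be passed straight back in

# Build a tiny two-chunk file.
fn = os.path.join(tempfile.mkdtemp(), "well.dat")
chunk = incremental_scale(1, 255, 0.0, 1.0, [0.0, 0.5, 1.0])  # -> 1, 128, 255
chunknumber = incrementally_build_binary_file(fn, chunk, 1)
chunknumber = incrementally_build_binary_file(fn, chunk, chunknumber)
```

Each call appends its chunk and hands back the next chunk number, which is exactly the calling pattern the Mathematica loops below rely on.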
EXAMPLE OF CREATING A LARGE BINARY DATA FILE
The excerpt below computes the function psi2 in a
volume layer by layer, saving the results after each layer is complete
before moving on to the next layer. Specifically, psi2 is computed
for 52 values of x at fixed y and z, and the results are saved.
Then y is moved one increment and, again, psi2 is
computed for 52 values of x. This is repeated
52 times until all values in the x-y plane are computed for one
value of z. This whole procedure is repeated for 52 increments
in z.
<<binfiles.ma
SetDirectory["c:/"];
filename = "well111.dat";
numberofsteps = 51;
L = 10;
globalmax = N[0.01];
globalmin = 0;
ixmin = 0;
ixmax = L;
ixstep = (ixmax-ixmin)/numberofsteps;
iymin = 0;
iymax = L;
iystep = (iymax-iymin)/numberofsteps;
izmin = 0;
izmax = L;
izstep = (izmax-izmin)/numberofsteps;
chunknumber = 1;
totalnumberofpoints = (numberofsteps+1)^3;
Print[ "Estimate of Binary Data File Size: ",totalnumberofpoints];
begintime = AbsoluteTime[];
computation = 0;
Do[
Do[
data = Table[ 0, {ix,ixmin,ixmax,ixstep}];
xvals = Table[ ix, {ix,ixmin,ixmax,ixstep}];
i = 1;
Do[
data[[i]] = N[psi2[ xvals[[i]],iy,iz,nx,ny,nz, norm] ];
computation = computation + 1,
{i,1,Length[data]}];
data=Flatten[data];
data = IncrementalScaleTool[1,255,globalmin,globalmax,data];
Do[ If[ Not[NumberQ[data[[i]] ] ],
Print[" data is not numerical at x = ",xvals[[i]]," y = ",iy," z = ",iz," it is ",data[[i]]," !"]],
{i,1,Length[data]}];
chunknumber = IncrementallyBuildBinaryFile[filename,
data, chunknumber],
{iy,iymin,iymax,iystep}];
Print[ N[ Length[data]^2 * ((iz-izmin)/izstep+1) /totalnumberofpoints * 100],
" % Done, Elapsed Time (Seconds): ", AbsoluteTime[] - begintime ],
{iz,izmin,izmax,izstep}]
Print[ "Actual File Length: ", (chunknumber - 1) * Length[data],
" Total Elapsed Time (Seconds): ", AbsoluteTime[] - begintime ];
Some explanations:
The statements
Print[ "Estimate of Binary Data File Size: ",totalnumberofpoints];
begintime = AbsoluteTime[];
will print out an estimate of the final file's size
and start a timer, while the statements
Print[ N[ Length[data]^2 * ((iz-izmin)/izstep+1) /totalnumberofpoints * 100],
" % Done, Elapsed Time (Seconds): ", AbsoluteTime[] - begintime ],
{iz,izmin,izmax,izstep}]
Print[ "Actual File Length: ", (chunknumber - 1) * Length[data],
" Total Elapsed Time (Seconds): ", AbsoluteTime[] - begintime ];
will print out the elapsed time and % done as the
computation proceeds and the actual final file size and total
elapsed time for the complete calculation.
Just in case the variable data isn't a list
(a vector), the statement
data=Flatten[data];
forces it to be so. As a precaution the statement
Do[ If[ Not[NumberQ[data[[i]] ] ],
Print[" data is not numerical at x = ",xvals[[i]]," y = ",iy," z = ",iz," it is ",data[[i]]," !"]],
{i,1,Length[data]}];
will let the user know if a data point is not numerical.
This will help troubleshoot problems and warn you of errors.
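The same safeguard is easy to express outside Mathematica; here is a Python analogue (an illustration, not the lab code):

```python
import numbers

def check_numeric(data):
    """Warn about any entry that is not a number; return True if all are."""
    bad = [(i, v) for i, v in enumerate(data) if not isinstance(v, numbers.Number)]
    for i, v in bad:
        print(f"data is not numerical at index {i}: {v!r}")
    return not bad
```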
Pre-Questions:
0) You might have to refer to the tutorials to remember Do loops. Explain what the following Do loop does. If you aren't sure of your answer, try it.
Do[
Print[ (i + 1)^2 ],
{i, 0, 10}]
THE REAL WORK
This document is on the web and you are welcome to copy the Mathematica statements and paste them to your own notebook. You will, however, need to make some slight modifications. This should save time and angst.
<<Sctrplot.ma
and
stcplt = ScatterPlot[1,1,1,0,L,10,10000,1000];
Save your work!
psi2[x_,y_,z_,nx_,ny_,nz_, norm_] := 1/norm ...
At the end of later.ma, add a cell that will
change the directory to your directory and read in the binfiles.ma
notebook. Here, I'm assuming that the disk and directory that
binfiles.ma is in is c:/yourdirectory.
This will also be where your data file will
be saved to. Please edit c:/yourdirectory
so that it points to the location you want
(good choices are your student directory or a floppy, a:/).
Be sure that the disk you choose has at least 1 MB of free space.
Note: you might want to change the filename to match the
quantum numbers you choose. Try not to overwrite any files you
want to keep!
SetDirectory["c:/yourdirectory"];
<<binfiles.ma
filename = "well1.dat";
numberofsteps = 51;
globalmax = N[0.01];
globalmin = 0;
L = 10;
ixmin = 0;
ixmax = L;
ixstep = (ixmax-ixmin)/numberofsteps;
iymin = 0;
iymax = L;
iystep = (iymax-iymin)/numberofsteps;
izmin = 0;
izmax = L;
izstep = (izmax-izmin)/numberofsteps;
chunknumber = 1;
totalnumberofpoints = (numberofsteps+1)^3;
Print[ "Estimate of Binary Data File Size: ",totalnumberofpoints];
begintime = AbsoluteTime[];
computation = 0;
Do[
Do[ data = Table[ 0, {ix,ixmin,ixmax,ixstep}];
xlen = Length[data];
xvals = Table[ ix, {ix,ixmin,ixmax,ixstep}];
i = 1;
Do[ data[[i]] = N[psi2[ xvals[[i]],iy,iz,nx,ny,nz,norm] ];
computation = computation + 1,
{i,1,xlen}];
data=Flatten[data];
(*
Print["computation # ",computation, " Length of data: ",Length[data], " xvals,iy,iz: ", xvals," ",iy," ",iz," ",
data];
*)
If[ Length[data] != numberofsteps+1, Print["WARNING! Length[data] is ",
Length[data]," and numberofsteps+1 is ",numberofsteps+1]];
Do[ If[ Not[NumberQ[data[[i]] ] ],
Print[" data is not numerical at x = ",xvals[[i]]," y = ",iy," z = ",iz," it is ",data[[i]]," !"]],
{i,1,Length[data]}];
data = IncrementalScaleTool[1,255,globalmin,globalmax,data];
chunknumber = IncrementallyBuildBinaryFile[filename,
data, chunknumber],
(*
Print["After save: computation # ", computation, " Length of data ",Length[data],
" xvals,iy,iz: ", xvals," ",iy," ",iz," ",BaseForm[data,16]],
*)
{iy,iymin,iymax,iystep}];
Print[ N[ Length[data]^2 * ((iz-izmin)/izstep+1) /totalnumberofpoints * 100],
" % Done, Elapsed Time (Seconds): ", AbsoluteTime[] - begintime ],
{iz,izmin,izmax,izstep}]
Print[ "Actual File Length: ", (chunknumber - 1) * Length[data],
" Total Elapsed Time (Seconds): ", AbsoluteTime[] - begintime ];
Save your work!
CREATING DATA FILES
4-D VISUALIZATION USING SLICER
Next, hit the survey button. Slicer will compute the maximum and minimum values in the entire file. If there is something wrong in the file, hitting this button will probably find it. If the survey process is successful and the min and max values are about 0 and 240, you are in great shape. Press the 'OK' button.
Data value | Opacity
-----------+--------
     50.00 |   0.05
     80.00 |   0.08
    110.00 |   0.10
    140.00 |   0.20
    170.00 |   0.30
    200.00 |   1.00
Of course, you can choose your own opacity values, but I find this choice to be fairly good.
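The table above is a piecewise-linear opacity transfer function over the 8-bit data values. A Python/NumPy sketch of the same mapping (leaving values below 50 fully transparent is my assumption):

```python
import numpy as np

vals    = [50.0, 80.0, 110.0, 140.0, 170.0, 200.0]
opacity = [0.05, 0.08, 0.10, 0.20, 0.30, 1.00]

v = np.arange(256)                               # the 8-bit data values
alpha = np.interp(v, vals, opacity, left=0.0)    # opacity for every data value
```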
ANALYSING YOUR OWN DATA