X-Ray Data Analysis

 


Unpacking Data
First off, the data from the x-ray run must be transferred to Idun; this is done using WinSCP. Connect to Idun and transfer the files over (they are typically stored in /files2/data/micromegas/Currentdate_or_month/). Copy the files unpack and unpackall from any of the existing folders into your new folder. From there, edit unpackall and put in all of the .dat file names, then run unpackall. This calls unpack, which in turn calls unpackdatanew and gemread from the ~/gemanal/ folders. This *should* give you .out files for all of the .dat files.

*Note: If any of the scope channels were labelled as anything other than an integer (e.g. Pad: Sum 1), the unpack procedure will fail. Simply edit all of the .dat files and change the pad numbers to integers; the pad numbers are listed at the top of the .dat files and are easy to manipulate. Also keep in mind that they must all have the same field width, so keep the correct number of spaces after any and all integers.
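For illustration only, here is a rough Python sketch of that fix; the file name, the number of header lines and the label to replace are all assumptions, since the exact .dat layout is not documented here:

# Hypothetical sketch: swap a non-integer pad label for an integer while
# padding the replacement to the original width, so the fixed-width header
# fields in the .dat file stay aligned.
def fix_pad_label(line, bad_label="Sum 1", new_label="1"):
    if bad_label not in line:
        return line
    return line.replace(bad_label, new_label.ljust(len(bad_label)))

with open("run01.dat") as f:                       # hypothetical file name
    lines = f.readlines()
lines[:4] = [fix_pad_label(l) for l in lines[:4]]  # pad numbers sit at the top
with open("run01_fixed.dat", "w") as f:
    f.writelines(lines)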

Bias Curves and Residuals/Resolution Plots (~gmd/xray_analysis/micromegas_oct_04/)
The first method is a bias curve analysis.  The data is analyzed one pad at a time.  For each pad, biascurveanal.f is used (this is the cleaned-up version of the program; it is called eventanal.f in the micromegas_oct_04 folders).  The data files which hold the event information for each pad (one data file per position on the pad, at least 8 or 9 files per pad) should be listed in a file called data.txt.

There are two hardcoded variables which need to be changed when running this program.  The first is the S variable, which indicates which pad is being bombarded with x-rays.  In the array of 6 pads, if the x-rays are pointed at the second pad, S=1.  The bias curve really only works when there is at least one pad on either side of the pad in question, so if data is only recorded from 6 pads, only 4 can be examined.  The second variable which needs to be modified for each pad appears just after the event data file is read in: a value is subtracted from XMEASURED, and this value should be the middle location of the pad in question (the location value given by the scope data acquisition program).  This centres the readings about the centre of the pad.

The program is first run with calibration set to '1'.  This runs through the first 400 events of each data file.  The position of each event is calculated using the centre of gravity of the top 3 pad readings.  This calculated position is then compared with the actual position (which is stored in the data files and scaled appropriately in the program).  The differences are averaged and a bias curve can be formed for that particular pad.  The outputs are in xmea_xcalc_resid_reso_uncor.txt, with one line per data file used.  The outputs are the x measured, x calculated, residual and resolution.
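As a rough sketch of the calibration logic (in Python, not the actual Fortran; the array layout, function names and 400-event cut are assumptions based on the description above):

import numpy as np

def centroid(charges, pad_centres):
    """Centre of gravity of the three largest pad readings."""
    charges = np.asarray(charges, dtype=float)
    pad_centres = np.asarray(pad_centres, dtype=float)
    top3 = np.argsort(charges)[-3:]        # indices of the three biggest signals
    return np.sum(charges[top3] * pad_centres[top3]) / np.sum(charges[top3])

def bias_point(events, pad_centres, x_measured):
    """One bias-curve point per data file: average calculated position over
    the first 400 events, compared with the (pad-centred) measured position."""
    x_calc = np.array([centroid(e, pad_centres) for e in events[:400]])
    return x_calc.mean(), x_calc.mean() - x_measured, x_calc.std()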

When the program is run for the second time, calibration should be set to '0'.  The second 400 events from each data file are now used.  This time the uncorrected .txt file (where the bias curve information is stored) is read back in.  A spline interpolation is done on the data points (x measured vs. x calculated) and this is used to determine the actual position.  Again the outputs are printed to a .txt file (*_cor.txt), as well as to an rzhist file.  This corrected rzhist file and text file hold the final residuals and resolution plots for the analysis.
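A minimal sketch of the correction pass, assuming the spline maps the calculated position back onto the measured position (SciPy stands in for the original interpolation routine):

import numpy as np
from scipy.interpolate import CubicSpline

# Bias-curve points written out by the calibration pass.
x_measured, x_calculated = np.loadtxt("xmea_xcalc_resid_reso_uncor.txt",
                                      usecols=(0, 1), unpack=True)

# The spline needs monotonically ordered points, hence the descending-order
# requirement for data.txt noted further down.
order = np.argsort(x_calculated)
correct = CubicSpline(x_calculated[order], x_measured[order])

def corrected_position(x_calc_raw):
    """Map a raw centre-of-gravity position to the bias-corrected position."""
    return correct(x_calc_raw)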

To run the bias analysis
-Must be run individually on each pad
From the folder gmd/xray_analysis/micromegas_oct_04/pad19:

>>compile eventanal.f
>>eventanal.run
>>1  (calibrate)
>>eventanal.run
>>0  (analyze)

-> Open xmea_xcalc_resid_reso_cor.txt in a spreadsheet; the columns in the file are as described above.

Previous directions/info (might have something I left out from above)…
To create the bias curves and the residual/resolution plots, the program eventanal.f can be used. The most current version is located in ~gmd/xray_analysis/micromegas_oct_04/pad16/. The program is currently set up to read 400 events, create the bias curve information, then read the following 400 events and use that information to create the residual/resolution information. The info is stored in uncorrected and corrected .txt and .rzhist files. The bias information can be found in the uncorrected .txt file (first two columns) and the residual/resolution info can be found in the corrected .txt file (3rd and 4th columns, I believe). Plots of the centroid vs. position can be found in the .rzhist files.

*Note: When reading the files in from data.txt, they must be in descending order (pad 18.0, 17.9, 17.8,...) for the spline to work properly.

The file is also set to ignore channels 1 and 5 for the data processing, and to use them only to discern bad data. This can be changed fairly easily at the point in the program where the events are read in. Channel 1 is overwritten, and channel 5 is stored in array position 7. Thus, with 8 channels read in, the upper bound is set to NSCOPE-2 so that only channels 1-6 are used. This was done because channels 1 and 5 were the Sum channels on the oscilloscope.

Pad Response Functions   (~gmd/xray_analysis/PRF/)
This method is still under development.  Ideally, the first 400 events from all of the files are used to create the PRF(s).  The created PRF(s) are used either as discrete data points or by fitting a Gaussian to the discrete points and using that function.  The second 400 events should then each be fitted to the PRF (either by interpolating between the data points or simply by using the fitted function).

This has been done before, although I am not sure how it was done, or in which files.  Alasdair has given some clues in Al_analysis.txt.  The files should be in the folder gmd/mmcalib/. 

The problem of different events having different total charges will hopefully be handled by normalizing all of the readings, dividing them by the total charge (sum) on all the pads.
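A minimal sketch of that idea, combined with the Gaussian fit mentioned above; the Gaussian form, starting values and function names are assumptions, and the real prfuse.f may do this differently:

import numpy as np
from scipy.optimize import curve_fit

def gaussian_prf(x, x0, amplitude, sigma):
    return amplitude * np.exp(-0.5 * ((x - x0) / sigma) ** 2)

def fit_event(charges, pad_centres, sigma_guess=1.0):
    """Normalize the pad readings by the event's total charge, then fit a
    Gaussian PRF to extract the position."""
    q = np.asarray(charges, dtype=float)
    pads = np.asarray(pad_centres, dtype=float)
    q = q / q.sum()                        # normalize out event-to-event charge
    p0 = [pads[np.argmax(q)], q.max(), sigma_guess]
    popt, _ = curve_fit(gaussian_prf, pads, q, p0=p0)
    return popt[0]                         # fitted position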

Some PRF histograms can be seen by running gmd/xray_analysis/PRF/totalcharge.f.  plot.kumac plots the total charge (two plots, one for each set of 400 events) and the 6 resulting PRFs.  This program also examines PRF differences between high-energy and low-energy events.  diff.kumac plots 12 PRFs, two for each pad: the first 400 events are used to determine the average total energy, and the second 400 events are split into those with larger or smaller totals.  The histograms are then normalized to area 1 and plotted.
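A rough sketch of that energy split and unit-area normalization, assuming the event data is available as an (events x pads) array; the file name and binning are arbitrary:

import numpy as np

events = np.loadtxt("events.txt")              # hypothetical: one row per event, one column per pad

mean_energy = events[:400].sum(axis=1).mean()  # average total charge of the first 400 events

second = events[400:800]
totals = second.sum(axis=1)
high, low = second[totals >= mean_energy], second[totals < mean_energy]

def unit_area_hist(values, bins=50):
    """Histogram normalized to unit area, like the kumac plots."""
    counts, edges = np.histogram(values, bins=bins)
    return counts / (counts * np.diff(edges)).sum(), edges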

The most recent PRF fitting program can be found in /gmd/xray_analysis/PRF/prfuse.f.  There have been many different versions of this program; this is the most recent one, and old data will be placed in /PRF/old/.  The main difference between versions is how the peak of each event was found: a) minimum value, b) value averaged around the peak, c) re-binned and then minimum value.

prfuse.f uses prfdata.txt as the pad response data (the curve that the data is fit to).  This file is currently created manually using Excel (or an OpenOffice spreadsheet) from the data produced by prfanal.f.

All of the following is run from the gmd/xray_analysis/PRF folder:

To create the PRF:

>>compile prfanal.f
>>prfanal.run
>>47 (data files)
>>800 (events)

-This creates prf.txt.  Using this file and Excel, shift the columns down by 40, 32, 24, 16, 8 and 0 rows respectively; this puts all the PRFs on top of each other.  Average the rows into a single column of 87 rows.  Copy and paste this column into a text file with the number 87 at the top, and call it prfdata.txt.
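The same shift-and-average can also be scripted; here is a rough sketch assuming prf.txt is 47 rows by 6 columns and that the 40, 32, ..., 0 offsets come from the 2 mm pad spacing being 8 steps of 250 microns:

import numpy as np

prf = np.loadtxt("prf.txt")                  # 47 positions x 6 pads
offsets = [40, 32, 24, 16, 8, 0]             # rows to shift each pad's column down by

stacked = np.full((prf.shape[0] + max(offsets), prf.shape[1]), np.nan)   # 87 rows
for pad, off in enumerate(offsets):
    stacked[off:off + prf.shape[0], pad] = prf[:, pad]

average = np.nanmean(stacked, axis=1)        # average across pads, ignoring empty cells

with open("prfdata.txt", "w") as f:
    f.write(f"{len(average)}\n")             # the "87" at the top that prfuse.f expects
    np.savetxt(f, average)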

To Run the Analysis:

>>compile prfuse.f
>>prfuse.run

This will perform the analysis using prfdata.txt.  To see the data:

>>paw++
paw++>>h/file 1 prf.rzhist
paw++>>h/pl 900
paw++>>h/pl 901

 

Previous PRF info
To create the pad response functions, one may use ~gmd/xray_analysis/PRF/prfanal.f. This is a simplified version of eventanal.f, which simply reads in the events from all of the listed files and records the average max at each position. Thus, for each of the 6 pads we used, and for each of the 47 positions (250 microns apart), it creates one point. These are then output to prf.txt in columns by pad number. This file can be quickly imported into Excel and used to create the pad response functions. To place all of the PRFs on top of each other, simply offset each pad's X-axis by 2 mm. This file was made slightly more user friendly by asking the user to input most of the variables. It is still assumed, however, that there are 2 sum channels, on channels 1 and 5. To make any other large changes one must alter the code itself.

That is all for data analysis, I hope it helps!

Last modified December 17th, 2004 by Dan Burke
