Commit 14c2709

Merge pull request #133 from CosmoStat/develop
v0.1.2
2 parents 7c1d364 + 4996c8b commit 14c2709

35 files changed: +3265 −588 lines changed

.gitignore

+1
@@ -158,3 +158,4 @@ cosmo_inference/cosmosis_config/priors_*
 cosmo_inference/cosmosis_config/values_*

 notebooks/tests/star_response.ipynb
+sp_validation.code-workspace

README.md

+27 −137

@@ -11,153 +11,43 @@ Validation of weak-lensing catalogues (galaxy and star shapes and other paramete
 | [![coc](https://img.shields.io/badge/conduct-read-lightgrey)](https://github.com/martin.kilbinger/sp_validation/blob/master/CODE_OF_CONDUCT.md) | [![Updates](https://pyup.io/repos/github/martin.kilbinger/sp_validation/shield.svg)](https://pyup.io/repos/github/martin.kilbinger/sp_validation/) | |

 ---
-> Author: <a href="www.cosmostat.org/people/kilbinger" target="_blank" style="text-decoration:none; color: #F08080">Axel Guinot, Martin Kilbinger, Samuel Farrens, Emma Ayçoberry</a>
+> Authors: <a href="www.cosmostat.org" target="_blank" style="text-decoration:none; color: #F08080">CosmoStat</a> lab at CEA Paris-Saclay, including:
+Axel Guinot, Martin Kilbinger, Lucie Baumont, Sacha Guerrini, Fabian Hervas Peters, Samuel Farrens, Emma Ayçoberry.</a>
 > Email: <a href="mailto:[email protected]" style="text-decoration:none; color: #F08080">[email protected]</a>
-> Year: 2021
 ---

-See [pyraliddemo](https://github.com/sfarrens/pyraliddemo) for a demo package created with the Pyralid template.
+This package contains a library and several scripts and notebooks. The main
+tasks that can be performed by `sp_validation` are:
+- Shear validation, in particular for the output of the `shapepipe`
+pipeline. This task takes as input a shear catalogue with metacal information,
+performs the calibration and carries out various tests, e.g. PSF leakage.
+A calibrated shear catalogue is then created as output.
+- Post processing. A number of scripts allow further processing of the above
+output calibrated shear catalogue.
+- Cosmology validation. This task uses the calibrated shear catalogue from
+above to run detailed diagnostics useful for further cosmology analysis,
+e.g. rho- and tau-statistics, E-/B-mode decomposition. Several catalogues
+can be compared and useful plots are created.
+- Cosmology inference. This task uses the calibrated shear catalogue from
+a shear validation run and performs cosmology inference using the two-point
+correlation function.


-## Run validation
+## Run shear validation

-### Set up
+See the [documentation](docs/source/run_validation.md) for instructions on how to set up and run `sp_validation`.

-Edit the file `notebooks/params.py` according to your data.

-Make sure that all input files set in `params.py` are accessible from the run directory. These are
-the ASCII file containing the tile IDs (`path_tile_ID`), the FITS galaxy catalogue (`galaxy_cat_path`),
-and the FITS star catalogue (`star_cat_path`).
+## Post processing

-The file `param.py` needs to be in the directory where the validation is run.
+The output(s) of one or more [shear validation runs](#run-shear-validation) can
+be processed further with post-processing scripts. See
+[here](docs/source/post_processing.md) for details.

-### Run
+## Cosmology validation

-There are two possibilities to carry out the validation, by running the jupyter notebooks
-in a browser, or by running a python script in the command line via `ipython`.
+TBD.

-#### Running the jupyter notebooks
-
-1. In the run directory start JupyterLab:
-```bash
-jupyer-lab
-```
-
-2. Load and run first notebook `main_set_up.ipynb`:
-1. Double-click in file manager tab on the left.
-2. Run notebook.
-
-3. Run all further notebooks:
-1. Double-click as above.
-2. Change kernel to `main_set_up.ipynb`:
-Either via the menu `Kernel -> Change Kernel` or
-by clicking on `Python 3` on the top left of the notebook.
-3. Run notebook.
-
-Run the notebooks in the following order:
-1. `main_set_up.ipynb` (main notebook)
-2. `metacal_global.ipynb`
-3. [`metacal_local.ipynb`] optional
-4. `psf_leakage.ipynb`
-5. `write_cat.ipynb`
-6. `maps.ipynb`
-7. `cosmology.ipynb`
-
-#### Running the python script
-
-1. Create the python script from the jupyter notbooks. In `notebooks`:
-```bash
-python config_convert.py
-```
-
-2. Run python script. In run directory:
-```bash
-ipython /path/to/sp_validation/notebooks/validation.py
-```
-
-## Further post-processing
-
-After the validation is run, further processing steps can be carried out using python scripts, as follows.
-
-### Combine validation runs
-
-Summary statistics created by validation runs of sub-areas of a survey can be combined to create joint summary statistics.
-This is useful in cases where the galaxy catalogue of an entire survey is too large to process, and needs to be broken
-down in smaller patches. This step provides global summary statistics from those patches.
-
-Depending on the type of summary, their combination can be the sum (e.g. for number of objects), average, weighted average (e.g. for the additive bias),
-the weighted average of the square (e.g. the ellipticity dispersion), the weighted variance (to combine variance estimates), or the weighted variance of the mean
-(to combine mean variance estimates).
-
-In a directory containing the subpatches as subdirectories, and within each their own `sp_output` results of the validation runs, type
-```bash
-/path/to/sp_validation/scripts/combine_results.py
-```
-This script creates a number of output files, including `R.txt` and `c.txt` with the combined multiplicative and additive biases, respectively.
-
-### Create combined calibrated shear catalogue
-
-After creating the combined results described above, the global calibration outputs can be used to create a combined, globally calibrated shear catalogue.
-The calibration is obtained from the files `R.txt` and `c.txt` created above.
-
-In the same directory containing the subpatches as above, type
-```bash
-/path/to/sp_validation/scripts/create_joint_shape_cat.py
-```
-It creates the joint output catalogue `joint.fits`.
-
-### PSF - galaxy object-wise ellipticity leakage
-
-The validation notebooks, in particular `psf_leakage.ipynb`, compute the leakage from PSF to galaxy ellipticity, as part of the global validation.
-This can also be done with the stand-alone python script `scripts/leakage_object.py`.
-
-For example, for a given patch, run
-```bash
-leakage_object.py -i sp_output/shape_catalog_extended_ngmix.fits -o leakage --hdu_psf 2 -v
-```
-to output plots and text files for the object-wise and scale-dependent leakage for that patch.
-
-Leakage for the joint v1.0 catalogue can be computed via
-```bash
-leakage_object.py -i SP/unions_shapepipe_extended_2022_v1.0.fits -I SP/unions_shapepipe_psf_2022_v1.0.1.fits -o leakage -v
-```
-assuming `SP` is a link to the v1.0 ShapePipe data directory.
-If this call was done in a subdirectory from where in `..` are the patch runs, joint plots of the scale-dependent leakage can be produced by
-```bash
-plot_leakage.py leakage/alpha_leakage_ngmix.txt ../P[1234567]/leakage/alpha_leakage_ngmix.txt`
-```
-This will read in the text files produces by the previous calls of `leakage_object.py`.
-
-This script fits a consistent linear or quadratic model of the galaxy
-ellipticity as function of PSF ellipticity. The best-fit parameters are written
-in an output `json` file.
-
-A summary table of the object-wise leakage linear parameters will be created by
-`combine_results.py`, see above. This call will add the leakage from the joint
-catalogue, assuming the results are stored in `joint/leakage`.
-
-### PSF - galaxy scale-dependent ellipticity leakage
-
-Use the script `scripts/leakage_scale.py`.
-
-### Correct for PSF - galaxy ellipticity leakage
-
-Experimentally we can apply the correction alpha * eps_PSF to the galaxy
-elliptity.
-
-First, `leakage_object.py` (see above) needs to be run.
-
-Second, the best-fit result from that previous call of `leakage_object.py` is
-read, applied to the extended shear catalogue, and the corrected ellipticity is
-written in an output file.
-
-```bash
-apply_alpha.py`
-```
-
-Third, compute the shear correlation function and E-/B-mode statistics
-using `treecorr`, with the notebook (or ipython script)
-`notebooks/analt/xip_xim_v1_alpha_cor[.ipynb|.py]`.
-
-Fourth, use the notebook `TBD` to plot the results.
+## Cosmology inference

+See the corresponding [documentation](cosmo_inference/README.md).
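For orientation, the calibration and leakage quantities referred to in the README diff above (the combined multiplicative bias `R` and additive bias `c` written to `R.txt`/`c.txt`, and the PSF leakage slope alpha fitted by `leakage_object.py`) are commonly related to the per-object ellipticities as in the sketch below. This follows the standard metacalibration convention and is not a quote of the exact formulae implemented in `sp_validation`; its conventions (matrix versus scalar response, signs, weighting) may differ.

```latex
% Hedged sketch, standard metacalibration-style conventions assumed.
\hat{g} \approx \frac{\varepsilon^{\rm obs} - c}{R},
\qquad
\varepsilon^{\rm gal} = \alpha\,\varepsilon^{\rm PSF} + \mathrm{const}
\;\;\Rightarrow\;\;
\varepsilon^{\rm corr} = \varepsilon^{\rm gal} - \alpha\,\varepsilon^{\rm PSF} .
```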

cosmo_inference/cosmocov_config/cosmocov.ini

+19 −9

@@ -16,22 +16,32 @@ h0 : 0.7
 # n_gal,lens_n_gal in gals/arcmin^2

 ; FOR SHAPEPIPE 3500
-; area : 3218.19
+; FOR SP_v1.3_LFmask_8k
+; area : 2138
 ; sourcephotoz : multihisto
 ; lensphotoz : multihisto
 ; source_tomobins : 1
 ; lens_tomobins : 1
-; sigma_e : 0.491712
-; source_n_gal : 8.42
+; sigma_e : 0.4384
+; source_n_gal : 7.6

-; FOR SHAPEPIPE 1500
-area : 1453
+; FOR SP_v1.4_P1+3
+; area : 1133
+; sourcephotoz : multihisto
+; lensphotoz : multihisto
+; source_tomobins : 1
+; lens_tomobins : 1
+; sigma_e : 0.4384
+; source_n_gal : 7.67
+
+; FOR SP_V0.0
+area : 1500
 sourcephotoz : multihisto
 lensphotoz : multihisto
 source_tomobins : 1
 lens_tomobins : 1
-sigma_e : 0.4808326112068524
-source_n_gal : 7.92
+sigma_e : 0.4243
+source_n_gal : 6.47

 # IA parameters
 IA : 1
@@ -42,8 +52,8 @@ oneplusz0_ia: 0.6
 # Covariance paramters
 #
 # tmin,tmax in arcminutes
-tmin : 1
-tmax : 200
+tmin : 0.1
+tmax : 250
 ntheta : 20

 # whether to calculate the non-Gaussian part as well
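This configuration is read by the CosmoCov wrapper `cosmo_inference/scripts/cosmocov_process.py`, whose diff appears further down in this commit. For orientation only, a manual run with this ini would look roughly like the sketch below; `/path/to` is a placeholder for the directory containing the CosmoCov checkout, and the wrapper script normally first appends the n(z) file paths and output directory to a per-run copy of the ini.

```bash
# Sketch of a manual CosmoCov run (placeholder paths); mirrors the loop in
# cosmo_inference/scripts/cosmocov_process.py shown later in this commit.
for i in {1..3}; do
    /path/to/CosmoCov/covs/cov "$i" cosmocov_config/cosmocov.ini
done
```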

cosmo_inference/notebooks/MCMC.ipynb

+2 −2

@@ -247,7 +247,7 @@
 ],
 "metadata": {
 "kernelspec": {
-"display_name": "my_env",
+"display_name": "Python 3 (ipykernel)",
 "language": "python",
 "name": "python3"
 },
@@ -261,7 +261,7 @@
 "name": "python",
 "nbconvert_exporter": "python",
 "pygments_lexer": "ipython3",
-"version": "3.10.8"
+"version": "3.9.0"
 },
 "vscode": {
 "interpreter": {

cosmo_inference/scripts/cosmocov_process.py

+20 −8

@@ -7,19 +7,30 @@
 import matplotlib.pyplot as plt
 import subprocess

-def calc_cov(root, nz_file):
+def calc_cov(root, nz_file, cosmocov_path):
 Path("data/%s/covs" %root).mkdir(parents=True, exist_ok=True)
 # write to the ini file, paths to the nz file
-with open('cosmocov_config/cosmocov_%s.ini' %root,'a') as f:
-f.write('shear_REDSHIFT_FILE : %s' %nz_file + '\n')
-f.write('clustering_REDSHIFT_FILE : %s' %nz_file + '\n')
-f.write('outdir : data/%s/covs/' %root + '\n')
-f.close()
+print("Creating cosmocov ini file for %s..." %root)
+if not os.path.exists('cosmocov_config/cosmocov_%s.ini' %root):
+with open('cosmocov_config/cosmocov_%s.ini' %root,'a') as f:
+with open('cosmocov_config/cosmocov.ini', 'r') as f2:
+f.write(f2.read())
+f.write('\n')
+f.write('shear_REDSHIFT_FILE : %s' %nz_file + '\n')
+f.write('clustering_REDSHIFT_FILE : %s' %nz_file + '\n')
+f.write('outdir : data/%s/covs/' %root + '\n')
+f.close()
+
+else:
+print("Cosmocov ini file for %s already exists!" %root)

 print("Running CosmoCov...\n")

+if not os.path.exists('%s/CosmoCov/covs/cov' %cosmocov_path):
+raise Exception("CosmoCov executable not found! Please check the path to CosmoCov in the config file.")
+
 results = subprocess.run('for i in {1..3}; do \
-CosmoCov/covs/cov $i cosmocov_config/cosmocov_%s.ini; done' %root,
+%s/CosmoCov/covs/cov $i cosmocov_config/cosmocov_%s.ini; done' %(cosmocov_path, root),
 capture_output = True,
 text = True,
 shell=True)
@@ -55,8 +66,9 @@ def convert_cov(filename):

 root = sys.argv[1]
 nz_file = sys.argv[2]
+cosmocov_path = sys.argv[3]

-calc_cov(root,nz_file)
+calc_cov(root,nz_file, cosmocov_path)

 covfile = "data/%s/covs/cov_%s" %(root,root)
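With the extra `cosmocov_path` argument, the script now expects three positional arguments (`sys.argv[1..3]`). A hypothetical invocation, with a made-up run label and n(z) file name, could look like this:

```bash
# Hypothetical example; "SP_run" and the n(z) file name are placeholders.
# Arguments: 1) root label, 2) redshift distribution file,
#            3) directory that contains the CosmoCov checkout
#               (i.e. <cosmocov_path>/CosmoCov/covs/cov must exist).
python cosmo_inference/scripts/cosmocov_process.py SP_run data/SP_run/nz_source.txt /path/to
```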

cosmo_inference/scripts/cosmosis_fitting.py

+2 −2

@@ -7,7 +7,7 @@
 import sys

 #transforms treecorr fits file of correlation functions into CosmoSIS-friendly 2pt FITS extension to be read by 2pt_likelihood
-def treecorr_to_fits(xipm_file,root):
+def treecorr_to_fits(filename1, filename2, root):

 xiplus_hdu = fits.open(filename1)
 ximinus_hdu = fits.open(filename2)
@@ -118,7 +118,7 @@ def rho_to_fits(filename):

 #create the required FITS extensions
 print("Creating 2PT fits extension...\n")
-xip_hdu, xim_hdu = treecorr_to_fits(two_pt_file,root)
+xip_hdu, xim_hdu = treecorr_to_fits(two_pt_file_xip, two_pt_file_xim, root)
 print("Creating CovMat fits extension...\n")
 cov_hdu = covdat_to_fits(cov_file)
 print("Creating n(z) fits extension...\n")

cosmo_inference/scripts/xi_sys_psf.py

+22 −14

@@ -6,6 +6,8 @@
 #This file should be added to your cosmosis_standard_library following the path shear/xi_sys/xi_sys_psf.py
 def setup(options):
 filename = options.get_string(option_section, 'data_file')
+xi_p_name = options.get_string(option_section, 'xi_plus_name')
+xi_m_name = options.get_string(option_section, 'xi_minus_name')
 data = fits.open(filename)
 rho_stats_name = options.get_string(option_section, 'rho_stats_name')
 samples_path = options.get_string(option_section,'samples')
@@ -16,31 +18,37 @@ def setup(options):

 rho_stats = data[rho_stats_name].data

-return mean, cov, rho_stats
+range_xi_p= options[option_section, 'angle_range_XI_PLUS_1_1']
+range_xi_m= options[option_section, 'angle_range_XI_MINUS_1_1']
+
+mask_xi_p = (data[xi_p_name].data['ANG'] > range_xi_p[0]) & (data[xi_p_name].data['ANG'] < range_xi_p[1])
+mask_xi_m = (data[xi_m_name].data['ANG'] > range_xi_m[0]) & (data[xi_m_name].data['ANG'] < range_xi_m[1])
+
+return mean, cov, rho_stats, mask_xi_p, mask_xi_m

 def execute(block, config):

-mean, cov, rho_stats = config
+mean, cov, rho_stats, mask_xi_p, mask_xi_m = config

 alpha, beta, eta = np.random.multivariate_normal(mean, cov)
 block['xi_sys', 'alpha'], block['xi_sys', 'beta'], block['xi_sys', 'eta'] = alpha, beta, eta

 xi_sys_p = (
-alpha**2*rho_stats["rho_0_p"]
-+ beta**2*rho_stats["rho_1_p"]
-+ eta**2*rho_stats["rho_3_p"]
-+ 2*alpha*beta*rho_stats["rho_2_p"]
-+ 2*beta*eta*rho_stats["rho_4_p"]
-+ 2*alpha*eta*rho_stats["rho_5_p"]
+alpha**2*rho_stats["rho_0_p"][mask_xi_p]
++ beta**2*rho_stats["rho_1_p"][mask_xi_p]
++ eta**2*rho_stats["rho_3_p"][mask_xi_p]
++ 2*alpha*beta*rho_stats["rho_2_p"][mask_xi_p]
++ 2*beta*eta*rho_stats["rho_4_p"][mask_xi_p]
++ 2*alpha*eta*rho_stats["rho_5_p"][mask_xi_p]
 )

 xi_sys_m = (
-alpha**2*rho_stats["rho_0_m"]
-+ beta**2*rho_stats["rho_1_m"]
-+ eta**2*rho_stats["rho_3_m"]
-+ 2*alpha*beta*rho_stats["rho_2_m"]
-+ 2*beta*eta*rho_stats["rho_4_m"]
-+ 2*alpha*eta*rho_stats["rho_5_m"]
+alpha**2*rho_stats["rho_0_m"][mask_xi_m]
++ beta**2*rho_stats["rho_1_m"][mask_xi_m]
++ eta**2*rho_stats["rho_3_m"][mask_xi_m]
++ 2*alpha*beta*rho_stats["rho_2_m"][mask_xi_m]
++ 2*beta*eta*rho_stats["rho_4_m"][mask_xi_m]
++ 2*alpha*eta*rho_stats["rho_5_m"][mask_xi_m]
 )

 block['xi_sys', 'xi_sys_vec'] = np.concatenate([xi_sys_p, xi_sys_m])
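Written out, the two masked sums above implement the usual rho-statistics model for the PSF contribution to the shear two-point functions, evaluated on the angular bins selected by `mask_xi_p` and `mask_xi_m`, with (alpha, beta, eta) drawn from the multivariate normal defined by `mean` and `cov`:

```latex
\xi^{\rm sys}_{\pm}(\theta)
  = \alpha^2 \rho_{0,\pm}(\theta)
  + \beta^2  \rho_{1,\pm}(\theta)
  + \eta^2   \rho_{3,\pm}(\theta)
  + 2\,\alpha\beta\, \rho_{2,\pm}(\theta)
  + 2\,\beta\eta\,   \rho_{4,\pm}(\theta)
  + 2\,\alpha\eta\,  \rho_{5,\pm}(\theta)
```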
