
Commit 9688052

installation instructions in README; minimal example
1 parent e33ea20 commit 9688052

File tree: 2 files changed, +120 -6 lines

README.md

Lines changed: 45 additions & 6 deletions
@@ -1,15 +1,54 @@
-## HMcode
+# HMcode

-To whom it may concern,
-
-I coded this `Python` version of `HMcode-2020` ([Mead et al. 2021](https://arxiv.org/abs/2009.01858)) up quite quickly before leaving academia in February 2023. It is written in pure `Python` and doesn't use any of the original Fortran code whatsoever. There is something amazing/dispiriting about coding something up in 3 days that previously took 5 years. A tragic last hoorah! At least I switched to `Python` eventually...
+A pure-`Python` implementation of the `HMcode-2020` method ([Mead et al. 2021](https://arxiv.org/abs/2009.01858)) for computing accurate non-linear matter power spectra across a wide range of cosmological parameters for $w(a)$-CDM models including curvature and massive neutrinos.

-You might also be interested in [`pyhalomodel`](https://pypi.org/project/pyhalomodel/), upon which this code depends, which implements a vanilla halo-model calculation for any desired large-scale-structure tracer. Alternatively, and very confusingly, you might be interested in this [`pyhmcode`](https://pypi.org/project/pyhmcode/), which provides a wrapper around the original `Fortran` `HMcode` implementation.
+## Installation

-To install, clone the repository, `cd` into the directory, and then
+Either
+```
+pip install hmcode
+```
+or, if you want to edit the code, use the demo notebook, or run the tests or consistency checks, then clone the repository, `cd` into the directory, and then
```
poetry install
```
+to create a virtual environment with everything you need to get going.
+
+## Dependencies
+
+- `numpy`
+- `scipy`
+- `camb`
+- `pyhalomodel`
+
+## Use
+
+```
+import numpy as np
+import camb
+import hmcode
+
+# Ranges
+k = np.logspace(-3, 1, 100) # Wavenumbers [h/Mpc]
+zs = [3., 2., 1., 0.5, 0.] # Redshifts
+
+# Run CAMB
+parameters = camb.CAMBparams(WantCls=False)
+parameters.set_cosmology(H0=70.)
+parameters.set_matter_power(redshifts=zs, kmax=100.) # kmax should be much larger than the wavenumber of interest
+results = camb.get_results(parameters)
+
+# HMcode
+Pk = hmcode.power(k, zs, results)
+```
+
+## Note
+
+To whom it may concern,
+
+I coded this `Python` version of `HMcode-2020` ([Mead et al. 2021](https://arxiv.org/abs/2009.01858)) up quite quickly before leaving academia in February 2023. It is written in pure `Python` and doesn't use any of the original Fortran code whatsoever. There is something amazing/dispiriting about coding something up in 3 days that previously took 5 years. A tragic last hoorah! At least I switched to `Python` eventually...
+
+You might also be interested in [`pyhalomodel`](https://pypi.org/project/pyhalomodel/), upon which this code depends, which implements a vanilla halo-model calculation for any desired large-scale-structure tracer. Alternatively, and very confusingly, you might be interested in this [`pyhmcode`](https://pypi.org/project/pyhmcode/), which provides a wrapper around the original `Fortran` `HMcode` implementation.

I compared it against the `CAMB-HMcode` version for 100 random sets of cosmological parameters ($k < 10 h\mathrm{Mpc}^{-1}$; $z < 3$). The level of agreement between the two codes is as follows:
- LCDM: Mean error: 0.10%; Std error: 0.03%; Worst error: 0.21%
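
As a sketch of how agreement figures like those above could be reproduced: the snippet below compares `hmcode.power` against CAMB's built-in `HMcode-2020` correction. Only `hmcode.power(k, zs, results)` and its `(len(zs), len(k))`-shaped return (used in `notebooks/minimal.ipynb` below) come from this commit; the CAMB non-linear settings (`NonLinear_both`, `halofit_version='mead2020'`) and the interpolator-based comparison are assumptions about how such a check might be set up, not the script used for the quoted numbers.

```
import numpy as np
import camb
import hmcode

# Ranges (as in the README example above)
k = np.logspace(-3, 1, 100) # Wavenumbers [h/Mpc]
zs = [3., 2., 1., 0.5, 0.] # Redshifts

# Run CAMB, additionally switching on its own HMcode-2020 non-linear correction
# (these non-linear settings are an assumption made for this sketch)
parameters = camb.CAMBparams(WantCls=False)
parameters.set_cosmology(H0=70.)
parameters.set_matter_power(redshifts=zs, kmax=100.)
parameters.NonLinear = camb.model.NonLinear_both
parameters.NonLinearModel.set_params(halofit_version='mead2020')
results = camb.get_results(parameters)

# Non-linear spectrum from CAMB's internal HMcode as an interpolator P(z, k)
Pk_camb = results.get_matter_power_interpolator(nonlinear=True, hubble_units=True, k_hunit=True)

# Non-linear spectrum from this package; shape (len(zs), len(k))
# (assumes hmcode.power is unaffected by the non-linear switch in `results`)
Pk_hmcode = hmcode.power(k, zs, results)

# Fractional differences, redshift by redshift
for iz, z in enumerate(zs):
    error = np.abs(Pk_hmcode[iz, :] / Pk_camb.P(z, k) - 1.)
    print('z = {:1.1f}: mean error = {:.2%}, worst error = {:.2%}'.format(z, error.mean(), error.max()))
```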

notebooks/minimal.ipynb

Lines changed: 75 additions & 0 deletions
@@ -0,0 +1,75 @@
{
 "cells": [
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Standard imports\n",
    "import numpy as np\n",
    "\n",
    "# Third-party imports\n",
    "import camb\n",
    "import hmcode\n",
    "\n",
    "# Ranges\n",
    "k = np.logspace(-3, 1, 100) # Wavenumbers [h/Mpc]\n",
    "zs = [3., 2., 1., 0.5, 0.] # Redshifts\n",
    "\n",
    "# Run CAMB\n",
    "parameters = camb.CAMBparams(WantCls=False)\n",
    "parameters.set_cosmology(H0=70.)\n",
    "parameters.set_matter_power(redshifts=zs, kmax=100.) # kmax should be much larger than the wavenumber of interest\n",
    "results = camb.get_results(parameters)\n",
    "\n",
    "# HMcode\n",
    "Pk = hmcode.power(k, zs, results)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import matplotlib.pyplot as plt\n",
    "\n",
    "# Plot\n",
    "for iz, z in enumerate(zs):\n",
    "    plt.loglog(k, Pk[iz, :], label='z = {:1.1f}'.format(z))\n",
    "plt.xlabel('$k$ $[h \\mathrm{Mpc}^{-1}]$')\n",
    "plt.ylabel('$P(k)$ $[(h^{-1}\\mathrm{Mpc})^3]$')\n",
    "plt.legend()\n",
    "plt.show()"
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "hmcode-S0Z6NeuA-py3.10",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.10.9"
  },
  "orig_nbformat": 4,
  "vscode": {
   "interpreter": {
    "hash": "e82eed08d8d58f92fc7a23e50eeb23ca4d9777479ef0aea969fd18c18280722f"
   }
  }
 },
 "nbformat": 4,
 "nbformat_minor": 2
}
