Compare revisions

Changes are shown as if the source revision was being merged into the target revision.
Commits on Source (2)
Showing 1464 additions and 47 deletions
.git/objects/pack/pack-e4d8eb24c8980fa83f56e5f1721afdf30e1511f0.pack filter=lfs diff=lfs merge=lfs -text
notebooks/Normal and t-Distribution/students.csv filter=lfs diff=lfs merge=lfs -text
.git/objects/pack/pack-503b0d3875a89d392191586b9f3094e805e9d648.pack filter=lfs diff=lfs merge=lfs -text
.git/objects/pack/pack-7d220d49f09763a29a52292682de638409708007.pack filter=lfs diff=lfs merge=lfs -text
.git/objects/pack/pack-78227f007957049b16a096f6f700f2945d6a0428.pack filter=lfs diff=lfs merge=lfs -text
notebooks/problems_demo.pdf filter=lfs diff=lfs merge=lfs -text
%% Cell type:code id:727a1945-aba9-4e43-8419-738f886f8f51 tags:
``` python
```
......@@ -73,7 +73,11 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.9.12"
}
},
"nbformat": 4,
......
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Bestimmung 95%-HDI für die Betaverteilung\n",
"\n",
"Die folgende Prozedur definiert, das HDI für die Beta-Verteilung:"
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {},
"outputs": [],
"source": [
"from scipy.stats import beta\n",
"import numpy as np\n",
"\n",
"def hdi(a,b, prob = 0.95):\n",
" k = 0\n",
" x = np.linspace(0,1,1000)\n",
" y = beta.pdf(x,a,b)\n",
" while True:\n",
" k = k+0.0001\n",
" if np.sum(y[y > k])/np.size(x) < prob:\n",
" break\n",
" return x[np.argwhere(y > k)][0] ,x[np.argwhere(y > k)][np.argwhere(y > k).size-1] "
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"In hdi(...) müssen nur noch die Parameter $a$ und $b$ eingegeben werden. Wenn die Prozentzahl noch geändert soll, fügen Sie ein zusätzliches prob=... ein. prob=0.5 würde dann bedeuten, dass 50% der glaubwürdigsten Parameter im HDI enthalten sind. "
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"(array([0.21221221]), array([0.78778779]))"
]
},
"execution_count": 3,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"hdi(5,5) "
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.9.12"
}
},
"nbformat": 4,
"nbformat_minor": 4
}
%% Cell type:markdown id: tags:

# Determining the 95% HDI for the beta distribution

The following procedure computes the HDI for the beta distribution:

%% Cell type:code id: tags:

``` python
from scipy.stats import beta
import numpy as np

def hdi(a, b, prob=0.95):
    k = 0
    x = np.linspace(0, 1, 1000)
    y = beta.pdf(x, a, b)
    while True:
        k = k + 0.0001
        if np.sum(y[y > k]) / np.size(x) < prob:
            break
    return x[np.argwhere(y > k)][0], x[np.argwhere(y > k)][np.argwhere(y > k).size - 1]
```

%% Cell type:markdown id: tags:

In hdi(...), only the parameters $a$ and $b$ need to be supplied. To change the coverage percentage, pass an additional prob=... argument; prob=0.5 would then mean that 50% of the most credible parameter values are contained in the HDI.
%% Cell type:code id: tags:
``` python
hdi(5,5)
```
%% Output
(array([0.21221221]), array([0.78778779]))
%% Cell type:markdown id: tags:
# Bestimmung 95%-HDI für die Betaverteilung
Die folgende Prozedur definiert, das HDI für die Beta-Verteilung:
%% Cell type:code id: tags:
``` python
from scipy.stats import beta
import numpy as np
def hdi(a,b, prob = 0.95):
k = 0
x = np.linspace(0,1,1000)
y = beta.pdf(x,a,b)
while True:
k = k+0.0001
if np.sum(y[y > k])/np.size(x) < prob:
break
return x[np.argwhere(y > k)][0] ,x[np.argwhere(y > k)][np.argwhere(y > k).size-1]
```
%% Cell type:markdown id: tags:
In hdi(...) müssen nur noch die Parameter $a$ und $b$ eingegeben werden. Wenn die Prozentzahl noch geändert soll, fügen Sie ein zusätzliches prob=... ein. prob=0.5 würde dann bedeuten, dass 50% der glaubwürdigsten Parameter im HDI enthalten sind.
%% Cell type:code id: tags:
``` python
hdi(5,5)
```
%% Output
(array([0.21221221]), array([0.78778779]))
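The notebook above notes that prob controls the mass covered by the interval. As a minimal self-contained sketch (the same grid-based procedure, with scalar return values added here for convenience), prob=0.5 yields a noticeably narrower interval than the default 0.95:

``` python
import numpy as np
from scipy.stats import beta

def hdi(a, b, prob=0.95):
    # Raise the density threshold k until the mass above it drops below prob;
    # the HDI endpoints are the outermost grid points still above the threshold.
    k = 0
    x = np.linspace(0, 1, 1000)
    y = beta.pdf(x, a, b)
    while True:
        k = k + 0.0001
        if np.sum(y[y > k]) / np.size(x) < prob:
            break
    idx = np.argwhere(y > k)
    return x[idx][0][0], x[idx][-1][0]

lo95, hi95 = hdi(5, 5)            # default 95% HDI
lo50, hi50 = hdi(5, 5, prob=0.5)  # 50% HDI covers less mass, so it is narrower
print(lo95, hi95)
print(lo50, hi50)
```

Since Beta(5, 5) is symmetric, both intervals are centered near 0.5.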
"mann.jung","mann.alt"
135,294
222,311
251,286
260,264
269,277
235,336
386,208
252,346
352,239
173,172
156,254
"x"
"1" 22
"2" 35
"3" 27
"4" 54
"5" 2
"6" 20
"7" 39
"8" 14
"9" 2
"10" 31
"11" 35
"12" 8
"13" 26
"14" 19
"15" 23
"16" 40
"17" 66
"18" 28
"19" 42
"20" 21
"21" 18
"22" 40
"23" 27
"24" 30
"25" 20
"26" 16
"27" 18
"28" 7
"29" 21
"30" 65
"31" 28.5
"32" 11
"33" 22
"34" 45
"35" 4
"36" 64
"37" 19
"38" 26
"39" 32
"40" 16
"41" 21
"42" 26
"43" 25
"44" 23
"45" 28
"46" 22
"47" 28
"48" 16
"49" 20
"50" 24
"51" 29
"52" 20
"53" 46
"54" 26
"55" 59
"56" 22
"57" 71
"58" 34
"59" 28
"60" 29
"61" 21
"62" 33
"63" 37
"64" 28
"65" 38
"66" 47
"67" 14.5
"68" 22
"69" 20
"70" 17
"71" 21
"72" 70.5
"73" 29
"74" 24
"75" 2
"76" 21
"77" 19
"78" 32.5
"79" 54
"80" 19
"81" 45
"82" 33
"83" 20
"84" 47
"85" 25
"86" 23
"87" 37
"88" 16
"89" 24
"90" 40
"91" 19
"92" 18
"93" 19
"94" 9
"95" 36.5
"96" 42
"97" 51
"98" 55.5
"99" 40.5
"100" 27
"101" 51
"102" 30
"103" 37
"104" 5
"105" 44
"106" 26
"107" 17
"108" 1
"109" 45
"110" 60
"111" 28
"112" 61
"113" 4
"114" 21
"115" 56
"116" 18
"117" 5
"118" 50
"119" 30
"120" 36
"121" 8
"122" 39
"123" 9
"124" 39
"125" 40
"126" 36
"127" 19
"128" 28
"129" 42
"130" 24
"131" 28
"132" 17
"133" 34
"134" 45.5
"135" 2
"136" 32
"137" 24
"138" 22
"139" 30
"140" 22
"141" 42
"142" 30
"143" 27
"144" 51
"145" 22
"146" 22
"147" 20.5
"148" 18
"149" 12
"150" 29
"151" 59
"152" 24
"153" 21
"154" 44
"155" 19
"156" 33
"157" 19
"158" 29
"159" 22
"160" 30
"161" 44
"162" 25
"163" 54
"164" 18
"165" 29
"166" 62
"167" 30
"168" 41
"169" 52
"170" 40
"171" 21
"172" 36
"173" 16
"174" 28
"175" 37
"176" 45
"177" 21
"178" 7
"179" 65
"180" 28
"181" 16
"182" 57
"183" 33
"184" 22
"185" 36
"186" 24
"187" 24
"188" 30
"189" 23.5
"190" 2
"191" 19
"192" 28
"193" 30
"194" 26
"195" 28
"196" 43
"197" 54
"198" 22
"199" 27
"200" 20
"201" 61
"202" 45.5
"203" 38
"204" 16
"205" 30
"206" 29
"207" 45
"208" 28
"209" 25
"210" 36
"211" 42
"212" 23
"213" 43
"214" 15
"215" 25
"216" 23
"217" 28
"218" 38
"219" 40
"220" 29
"221" 45
"222" 35
"223" 27
"224" 30
"225" 18
"226" 19
"227" 22
"228" 3
"229" 27
"230" 20
"231" 19
"232" 32
"233" 27
"234" 18
"235" 1
"236" 19
"237" 28
"238" 22
"239" 31
"240" 46
"241" 23
"242" 26
"243" 21
"244" 28
"245" 20
"246" 34
"247" 51
"248" 21
"249" 3
"250" 42
"251" 27
"252" 22
"253" 32
"254" 30
"255" 10
"256" 21
"257" 29
"258" 28
"259" 18
"260" 54
"261" 28
"262" 17
"263" 50
"264" 21
"265" 64
"266" 31
"267" 20
"268" 25
"269" 36
"270" 28
"271" 30
"272" 24
"273" 65
"274" 17
"275" 34
"276" 47
"277" 48
"278" 34
"279" 38
"280" 21
"281" 56
"282" 22
"283" 39
"284" 38
"285" 22
"286" 40
"287" 34
"288" 29
"289" 22
"290" 9
"291" 37
"292" 50
"293" 8
"294" 58
"295" 30
"296" 19
"297" 21
"298" 55
"299" 71
"300" 21
"301" 26
"302" 55
"303" 25
"304" 24
"305" 17
"306" 21
"307" 21
"308" 37
"309" 18
"310" 28
"311" 66
"312" 24
"313" 47
"314" 30
"315" 32
"316" 22
"317" 35
"318" 18
"319" 40.5
"320" 49
"321" 39
"322" 23
"323" 17
"324" 17
"325" 30
"326" 45
"327" 69
"328" 9
"329" 11
"330" 50
"331" 64
"332" 33
"333" 27
"334" 21
"335" 62
"336" 45
"337" 30
"338" 40
"339" 28
"340" 40
"341" 62
"342" 24
"343" 19
"344" 29
"345" 28
"346" 16
"347" 19
"348" 18
"349" 54
"350" 36
"351" 16
"352" 47
"353" 22
"354" 22
"355" 35
"356" 47
"357" 40
"358" 37
"359" 36
"360" 49
"361" 18
"362" 42
"363" 37
"364" 44
"365" 36
"366" 30
"367" 39
"368" 21
"369" 22
"370" 35
"371" 34
"372" 26
"373" 26
"374" 27
"375" 21
"376" 21
"377" 61
"378" 57
"379" 26
"380" 18
"381" 51
"382" 30
"383" 9
"384" 32
"385" 31
"386" 41
"387" 37
"388" 20
"389" 2
"390" 19
"391" 21
"392" 23
"393" 21
"394" 18
"395" 24
"396" 27
"397" 32
"398" 23
"399" 58
"400" 40
"401" 47
"402" 36
"403" 32
"404" 25
"405" 49
"406" 43
"407" 31
"408" 70
"409" 19
"410" 18
"411" 24.5
"412" 43
"413" 28
"414" 20
"415" 14
"416" 60
"417" 25
"418" 14
"419" 19
"420" 18
"421" 25
"422" 60
"423" 52
"424" 44
"425" 49
"426" 42
"427" 18
"428" 25
"429" 26
"430" 39
"431" 41
"432" 29
"433" 52
"434" 19
"435" 33
"436" 17
"437" 34
"438" 50
"439" 20
"440" 25
"441" 25
"442" 11
"443" 41
"444" 23
"445" 23
"446" 28.5
"447" 48
"448" 20
"449" 32
"450" 36
"451" 24
"452" 70
"453" 16
"454" 19
"455" 31
"456" 33
"457" 23
"458" 28
"459" 18
"460" 34
"461" 23
"462" 41
"463" 16
"464" 46
"465" 30.5
"466" 28
"467" 32
"468" 24
"469" 48
"470" 57
"471" 29
"472" 18
"473" 20
"474" 22
"475" 29
"476" 35
"477" 25
"478" 25
"479" 8
"480" 46
"481" 20
"482" 16
"483" 21
"484" 43
"485" 25
"486" 39
"487" 30
"488" 30
"489" 34
"490" 31
"491" 39
"492" 18
"493" 39
"494" 26
"495" 39
"496" 35
"497" 6
"498" 30.5
"499" 39
"500" 23
"501" 31
"502" 43
"503" 10
"504" 38
"505" 2
"506" 36
"507" 23
"508" 30
"509" 23
"510" 18
"511" 21
"512" 20
"513" 20
"514" 16
"515" 34.5
"516" 17
"517" 42
"518" 18
"519" 35
"520" 28
"521" 4
"522" 74
"523" 9
"524" 44
"525" 30
"526" 41
"527" 21
"528" 14
"529" 24
"530" 31
"531" 23
"532" 26
"533" 33
"534" 47
"535" 20
"536" 19
"537" 23
"538" 33
"539" 22
"540" 28
"541" 25
"542" 39
"543" 27
"544" 7
"545" 32
......@@ -59,7 +59,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.9.6"
"version": "3.9.7"
}
},
"nbformat": 4,
......
notebooks/MCMC/output.png (97 KiB)
"","GrowingArea","HoldingTemp","Size","StoragePeriod","CookingMethod","TextureScore","FlavorScore","MoistnessScore"
"1",1,1,"Large",1,"Boil",2.9,3.2,3
"2",1,1,"Large",1,"Steam",2.3,2.5,2.6
"3",1,1,"Large",1,"Mash",2.5,2.8,2.8
"4",1,1,"Large",1,"Bake.350",2.1,2.9,2.4
"5",1,1,"Large",1,"Bake.450",1.9,2.8,2.2
"6",1,1,"Large",2,"Boil",1.8,3,1.7
"7",1,1,"Large",2,"Steam",2.6,3.1,2.4
"8",1,1,"Large",2,"Mash",3,3,2.9
"9",1,1,"Large",2,"Bake.350",2.2,3.2,2.5
"10",1,1,"Large",2,"Bake.450",2,2.8,1.9
"11",1,1,"Large",3,"Boil",1.8,2.6,1.5
"12",1,1,"Large",3,"Steam",2,2.8,1.9
"13",1,1,"Large",3,"Mash",2.6,2.6,2.6
"14",1,1,"Large",3,"Bake.350",2.1,3.2,2.1
"15",1,1,"Large",3,"Bake.450",2.5,3,2.1
"16",1,1,"Large",4,"Boil",2.6,3.1,2.4
"17",1,1,"Large",4,"Steam",2.7,2.9,2.4
"18",1,1,"Large",4,"Mash",2.2,3.1,2.3
"19",1,1,"Large",4,"Bake.350",3.1,3.4,2.7
"20",1,1,"Large",4,"Bake.450",3,2.6,2.7
"21",1,1,"Medium",1,"Boil",3.1,3,2.8
"22",1,1,"Medium",1,"Steam",2.7,2.8,2.7
"23",1,1,"Medium",1,"Mash",2.4,3,2.9
"24",1,1,"Medium",1,"Bake.350",2.2,2.9,2.3
"25",1,1,"Medium",1,"Bake.450",1.9,2.9,2
"26",1,1,"Medium",2,"Boil",1.8,2.6,1.8
"27",1,1,"Medium",2,"Steam",2.2,2.9,2.1
"28",1,1,"Medium",2,"Mash",2.8,3.2,2.8
"29",1,1,"Medium",2,"Bake.350",2.3,3.2,2.4
"30",1,1,"Medium",2,"Bake.450",2,3,2
"31",1,1,"Medium",3,"Boil",1.9,3,1.8
"32",1,1,"Medium",3,"Steam",1.8,2.7,1.8
"33",1,1,"Medium",3,"Mash",3.3,3.2,3.2
"34",1,1,"Medium",3,"Bake.350",2.5,3.1,2.2
"35",1,1,"Medium",3,"Bake.450",2.5,3.4,2.3
"36",1,1,"Medium",4,"Boil",1.5,2.6,1.3
"37",1,1,"Medium",4,"Steam",1.4,2.6,1.3
"38",1,1,"Medium",4,"Mash",2.1,2.5,2
"39",1,1,"Medium",4,"Bake.350",1.8,3.1,1.7
"40",1,1,"Medium",4,"Bake.450",1.7,2.7,1.7
"41",1,2,"Large",1,"Boil",2.8,2.6,3
"42",1,2,"Large",1,"Steam",2.5,2.4,2.8
"43",1,2,"Large",1,"Mash",3.2,2.7,3.2
"44",1,2,"Large",1,"Bake.350",2.4,2.4,2.6
"45",1,2,"Large",1,"Bake.450",2,2.5,2.2
"46",1,2,"Large",2,"Boil",2.3,2.9,1.9
"47",1,2,"Large",2,"Steam",2.8,2.7,2.5
"48",1,2,"Large",2,"Mash",3.7,3.3,3.1
"49",1,2,"Large",2,"Bake.350",2.8,2.7,2.5
"50",1,2,"Large",2,"Bake.450",2.6,2.6,2.3
"51",1,2,"Large",3,"Boil",2.4,2.7,2
"52",1,2,"Large",3,"Steam",2.7,2.5,2.1
"53",1,2,"Large",3,"Mash",2.7,2.9,2.7
"54",1,2,"Large",3,"Bake.350",2.6,2.8,2.2
"55",1,2,"Large",3,"Bake.450",2.6,2.8,1.8
"56",1,2,"Large",4,"Boil",3,3.2,2.4
"57",1,2,"Large",4,"Steam",3.1,3.1,2.8
"58",1,2,"Large",4,"Mash",3.6,3,3.3
"59",1,2,"Large",4,"Bake.350",3.4,3.4,2.9
"60",1,2,"Large",4,"Bake.450",2.7,3,2.6
"61",1,2,"Medium",1,"Boil",2.2,2.6,2.6
"62",1,2,"Medium",1,"Steam",2.3,2.3,2.4
"63",1,2,"Medium",1,"Mash",2.7,2.6,2.7
"64",1,2,"Medium",1,"Bake.350",2,2.5,2
"65",1,2,"Medium",1,"Bake.450",1.4,2.4,1.7
"66",1,2,"Medium",2,"Boil",2.5,2.6,1.9
"67",1,2,"Medium",2,"Steam",3.2,2.9,2.9
"68",1,2,"Medium",2,"Mash",3,3.1,1.9
"69",1,2,"Medium",2,"Bake.350",2.6,2.8,2.7
"70",1,2,"Medium",2,"Bake.450",2.6,3.1,2.2
"71",1,2,"Medium",3,"Boil",2.4,3,2.2
"72",1,2,"Medium",3,"Steam",2.8,2.8,2.5
"73",1,2,"Medium",3,"Mash",3.3,3.1,3.1
"74",1,2,"Medium",3,"Bake.350",2.8,3.1,2.6
"75",1,2,"Medium",3,"Bake.450",2.9,2.9,2.4
"76",1,2,"Medium",4,"Boil",1.4,2.9,1.4
"77",1,2,"Medium",4,"Steam",2.1,2.5,1.6
"78",1,2,"Medium",4,"Mash",2.3,2.6,1.8
"79",1,2,"Medium",4,"Bake.350",1.8,2.9,1.6
"80",1,2,"Medium",4,"Bake.450",1.5,2.4,1.5
"81",2,1,"Large",1,"Boil",2.5,2.7,2.6
"82",2,1,"Large",1,"Steam",2.8,2.9,2.7
"83",2,1,"Large",1,"Mash",2.2,3,3
"84",2,1,"Large",1,"Bake.350",2.5,3.1,2.4
"85",2,1,"Large",1,"Bake.450",2.7,3,2.3
"86",2,1,"Large",2,"Boil",2.7,2.8,2.4
"87",2,1,"Large",2,"Steam",2.5,2.9,2.6
"88",2,1,"Large",2,"Mash",1.6,3.1,1.8
"89",2,1,"Large",2,"Bake.350",2.5,3.2,2.3
"90",2,1,"Large",2,"Bake.450",2.5,2.9,2.5
"91",2,1,"Large",3,"Boil",2.2,2.8,2.3
"92",2,1,"Large",3,"Steam",2.4,2.9,2.1
"93",2,1,"Large",3,"Mash",2.2,3.1,2.3
"94",2,1,"Large",3,"Bake.350",3.1,3.1,2.6
"95",2,1,"Large",3,"Bake.450",2.9,3.3,2.8
"96",2,1,"Large",4,"Boil",2.4,3.4,2.4
"97",2,1,"Large",4,"Steam",3.1,3.1,2.7
"98",2,1,"Large",4,"Mash",2.3,3.2,2.5
"99",2,1,"Large",4,"Bake.350",3.2,3.5,3.1
"100",2,1,"Large",4,"Bake.450",2.9,2.7,2.7
"101",2,1,"Medium",1,"Boil",2.6,3.3,2.6
"102",2,1,"Medium",1,"Steam",2.7,3,2.7
"103",2,1,"Medium",1,"Mash",2.5,2.9,2.7
"104",2,1,"Medium",1,"Bake.350",2.4,3,2.5
"105",2,1,"Medium",1,"Bake.450",2,2.9,2.1
"106",2,1,"Medium",2,"Boil",2,3,1.9
"107",2,1,"Medium",2,"Steam",2.3,3.1,2.3
"108",2,1,"Medium",2,"Mash",1.7,3.1,2.4
"109",2,1,"Medium",2,"Bake.350",2.6,3.1,2.5
"110",2,1,"Medium",2,"Bake.450",2.2,2.9,2.1
"111",2,1,"Medium",3,"Boil",1.7,3.2,1.5
"112",2,1,"Medium",3,"Steam",2.2,3.2,2
"113",2,1,"Medium",3,"Mash",1.7,3.1,2
"114",2,1,"Medium",3,"Bake.350",2.8,3.2,2.7
"115",2,1,"Medium",3,"Bake.450",2.6,3.3,2.6
"116",2,1,"Medium",4,"Boil",2,3.5,2.2
"117",2,1,"Medium",4,"Steam",1.8,3,2
"118",2,1,"Medium",4,"Mash",1.6,3.4,2.1
"119",2,1,"Medium",4,"Bake.350",2.8,3.3,2.6
"120",2,1,"Medium",4,"Bake.450",2.7,2.3,2.6
"121",2,2,"Large",1,"Boil",2.8,2.6,2.5
"122",2,2,"Large",1,"Steam",2.9,2,2.7
"123",2,2,"Large",1,"Mash",3,2.7,2.9
"124",2,2,"Large",1,"Bake.350",2.6,3,2.5
"125",2,2,"Large",1,"Bake.450",2.8,2.2,2.6
"126",2,2,"Large",2,"Boil",3.4,3.2,2.8
"127",2,2,"Large",2,"Steam",3.5,2.9,3
"128",2,2,"Large",2,"Mash",2.6,2.8,2.5
"129",2,2,"Large",2,"Bake.350",3.3,3,3.1
"130",2,2,"Large",2,"Bake.450",2,2.8,2.5
"131",2,2,"Large",3,"Boil",2.8,2.8,2.6
"132",2,2,"Large",3,"Steam",3.5,2.8,3
"133",2,2,"Large",3,"Mash",2.5,3.2,2.3
"134",2,2,"Large",3,"Bake.350",3.3,3,2.7
"135",2,2,"Large",3,"Bake.450",3.5,2.9,2.9
"136",2,2,"Large",4,"Boil",3.2,3.4,2.5
"137",2,2,"Large",4,"Steam",3.3,2.8,2.8
"138",2,2,"Large",4,"Mash",3,3,2.8
"139",2,2,"Large",4,"Bake.350",3.5,3.2,3.1
"140",2,2,"Large",4,"Bake.450",3.4,3,2.8
"141",2,2,"Medium",1,"Boil",2.7,2.5,2.5
"142",2,2,"Medium",1,"Steam",2.5,2.7,2.3
"143",2,2,"Medium",1,"Mash",3.2,2.7,3
"144",2,2,"Medium",1,"Bake.350",2.4,2.7,2.5
"145",2,2,"Medium",1,"Bake.450",2.7,2.1,2.3
"146",2,2,"Medium",2,"Boil",2.2,2.7,2.3
"147",2,2,"Medium",2,"Steam",3.1,2.9,2.6
"148",2,2,"Medium",2,"Mash",2.2,2.8,3.1
"149",2,2,"Medium",2,"Bake.350",2.9,3,2.7
"150",2,2,"Medium",2,"Bake.450",2.8,2.7,2.6
"151",2,2,"Medium",3,"Boil",2.5,3.2,2.3
"152",2,2,"Medium",3,"Steam",2.9,3.3,2.7
"153",2,2,"Medium",3,"Mash",2.5,3.1,2.5
"154",2,2,"Medium",3,"Bake.350",3,2.9,2.5
"155",2,2,"Medium",3,"Bake.450",2.9,3.1,3.1
"156",2,2,"Medium",4,"Boil",2.7,3.3,2.6
"157",2,2,"Medium",4,"Steam",2.6,2.8,2.3
"158",2,2,"Medium",4,"Mash",2.5,3.1,2.6
"159",2,2,"Medium",4,"Bake.350",3.4,3.3,3
"160",2,2,"Medium",4,"Bake.450",2.5,2.8,2.3
%% Cell type:code id: tags:
``` python
%matplotlib inline
import pymc3 as pm
import matplotlib.pyplot as plt
import scipy.stats as st
import arviz as az
import numpy as np
import warnings
warnings.filterwarnings("ignore")
```
%% Output
WARNING (theano.tensor.blas): Using NumPy C-API based implementation for BLAS functions.
%% Cell type:code id: tags:

``` python
# Needed later
def hdi(a, b, prob=0.95):
    k = 0
    x = np.linspace(0, 1, 100000)
    y = st.beta.pdf(x, a, b)
    while True:
        k = k + 0.0001
        if np.sum(y[y > k]) / np.size(x) < prob:
            break
    hdi_l, hdi_r = x[np.argwhere(y > k)][0][0], x[np.argwhere(y > k)][np.argwhere(y > k).size - 1][0]
    return hdi_l, hdi_r

def plot_beta(a, b):
    x = np.linspace(0, 1, 1000)
    y = st.beta.pdf(x, a, b)
    hdi_l, hdi_r = hdi(a, b)
    omega = (a - 1) / (a + b - 2)  # mode of the beta distribution
    plt.plot(x, y)
    plt.plot([hdi_l, hdi_r], [.1, .1])
    plt.text((hdi_l + hdi_r) / 2, .5, "95 HDI", ha="center")
    plt.text(hdi_l, .2, str(np.round(hdi_l, 3)), ha="right")
    plt.text(hdi_r, .2, str(np.round(hdi_r, 3)), ha="left")
    plt.text(0.2, st.beta.pdf(omega, a, b) - .5, "om=" + str(np.round(omega, 3)), ha="right")
```
%% Cell type:code id: tags:
``` python
trials = 20
head = 4
# unknown value in a real experiment
data = np.zeros(trials)
data[np.arange(head)] = 1
```
%% Cell type:code id: tags:

``` python
alph = 1
bet = 1

with pm.Model() as our_first_model:
    theta = pm.Beta('theta', alpha=alph, beta=bet)
    y = pm.Bernoulli('y', p=theta, observed=data)
    trace = pm.sample(100)
```
%% Output
Only 100 samples in chain.
Auto-assigning NUTS sampler...
Initializing NUTS using jitter+adapt_diag...
Multiprocess sampling (4 chains in 4 jobs)
NUTS: [theta]
Sampling 4 chains for 1_000 tune and 100 draw iterations (4_000 + 400 draws total) took 2 seconds.
The acceptance probability does not match the target. It is 0.8820701017088629, but should be close to 0.8. Try to increase the number of tuning steps.
%% Cell type:code id: tags:
``` python
az.plot_trace(trace)
```
%% Output
Got error No model on context stack. trying to find log_likelihood in translation.
Got error No model on context stack. trying to find log_likelihood in translation.
array([[<AxesSubplot:title={'center':'theta'}>,
<AxesSubplot:title={'center':'theta'}>]], dtype=object)
%% Cell type:code id: tags:
``` python
az.summary(trace, hdi_prob=.95)
```
%% Output
Got error No model on context stack. trying to find log_likelihood in translation.
        mean     sd  hdi_2.5%  hdi_97.5%  mcse_mean  mcse_sd  ess_bulk  ess_tail  r_hat
theta  0.224  0.086     0.068      0.384      0.007    0.005     155.0     171.0   1.02
%% Cell type:code id: tags:
``` python
az.plot_posterior(trace, hdi_prob=.95, point_estimate="mode")
```
%% Output
Got error No model on context stack. trying to find log_likelihood in translation.
<AxesSubplot:title={'center':'theta'}>
%% Cell type:code id: tags:
``` python
plot_beta(a=alph+head, b=bet+trials-head)
```
%% Output
......
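The final cell plots plot_beta(a=alph+head, b=bet+trials-head): because the Beta prior is conjugate to the Bernoulli likelihood, the exact posterior is available in closed form. A quick sketch (scipy only, no PyMC3 required) checks that its mean is close to the sampled mean 0.224 reported by az.summary above:

``` python
from scipy.stats import beta

# Conjugate Beta-Bernoulli update: a Beta(alph, bet) prior combined with
# `head` successes in `trials` Bernoulli draws gives the posterior
# Beta(alph + head, bet + trials - head).
alph, bet = 1, 1
trials, head = 20, 4
posterior = beta(alph + head, bet + trials - head)

# Posterior mean is (alph + head) / (alph + bet + trials) = 5/22 ≈ 0.227
post_mean = posterior.mean()
print(post_mean)
```

The analytic mean 5/22 ≈ 0.227 agrees with the MCMC estimate to within its reported standard deviation (sd = 0.086).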