rynlee
Hi All,
So I'm trying to tackle this ODE:
f''[x] == f[x] DiracDelta[x - a] - b,
with Robin boundary conditions
f'[0] == f[0], f'[c] == f[c],
where a, b, and c are constants.
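For concreteness, here is the problem written out as a direct DSolve call (just a transcription of the above; I don't know whether DSolve can actually handle the DiracDelta term with these boundary conditions):

DSolve[{f''[x] == f[x] DiracDelta[x - a] - b,
   f'[0] == f[0], f'[c] == f[c]}, f[x], x]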
If you're curious, this comes from treating the steady state of a 1D diffusion system: there is homogeneous generation along the length (b, in 1/(length*time) units), f(x) is the population distribution, and a point scatterer at x = a consumes population at a rate proportional to the concentration there, f(a). That is, with f = f(x, t),
df/dt = D*(d^2/dx^2)f + b - f*DiracDelta(x - a) = 0,
and setting the time derivative to zero (and rescaling to absorb D) gives the ODE above.
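As a sanity check on any closed form, one could also regularize the delta as a narrow Gaussian and solve the boundary-value problem numerically. A minimal sketch, where the values of a, b, c and the width eps are placeholders:

With[{a = 1, b = 1, c = 2, eps = 0.01},
 NDSolve[{f''[x] == f[x] PDF[NormalDistribution[a, eps], x] - b,
    f'[0] == f[0], f'[c] == f[c]}, f, {x, 0, c}]]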
I tried a Laplace transform approach but couldn't hack it. If someone has another idea on how to approach this, I'd appreciate it!
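One direction that might be more tractable than transforms is piecewise matching: away from x == a the equation is just f''[x] == -b, so the solution is a quadratic on each side, with f continuous at x == a and (integrating the ODE across x == a) the jump condition f'(a+) - f'(a-) == f(a). That reduces the whole problem to a linear system for the four coefficients (the names fL, fR, p1, p2, q1, q2 below are mine):

fL[x_] := -b x^2/2 + p1 x + p2 (* branch on 0 <= x < a *)
fR[x_] := -b x^2/2 + q1 x + q2 (* branch on a < x <= c *)
Solve[{fL'[0] == fL[0], fR'[c] == fR[c],
   fL[a] == fR[a], fR'[a] - fL'[a] == fL[a]},
  {p1, p2, q1, q2}]

I haven't pushed this through, though, so corrections welcome.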
Thanks!