Problem: I'm trying to optimize an inexpensive digital camera with a CCD sensor for NIR video. As of right now (after removing the IR-cut filter) it doesn't work well in the dark at all (even with a basic visible-light filter ... obviously). I have to shine at least a little regular light on an object, or use a remote control's IR LEDs, to see it in complete darkness.
What I'd like to figure out now is how much IR energy I need to direct at an object at a given distance to be able to make out what it is.
The second part is to figure out the camera's sensitivity. I figure it's between 780 nm and 1200 nm, which is why heat doesn't show up (unless I heat a knife to glowing orange, wait for it to cool just to the point where it's no longer visible in the dark, and I still get my white spot indicating a heat source). But how can I tell with certainty that the response is between x µm and y µm?
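For what it's worth, a rough blackbody calculation is consistent with the knife observation. The sketch below (assuming a silicon CCD that stops responding near ~1.1 µm, and illustrative temperatures for skin and for a hot knife) compares Planck spectral radiance at 1.0 µm: room-temperature objects emit essentially nothing in that band, while a glowing-hot knife emits many orders of magnitude more, so only the latter registers.

```python
import math

# Physical constants (SI)
H = 6.626e-34    # Planck constant, J*s
C = 2.998e8      # speed of light, m/s
K_B = 1.381e-23  # Boltzmann constant, J/K

def planck_spectral_radiance(wavelength_m: float, temp_k: float) -> float:
    """Blackbody spectral radiance B(lambda, T) in W * sr^-1 * m^-3."""
    a = 2.0 * H * C**2 / wavelength_m**5
    b = math.exp(H * C / (wavelength_m * K_B * temp_k)) - 1.0
    return a / b

# Assumption: the CCD still responds around 1.0 um (silicon cuts off near ~1.1 um).
wavelength = 1.0e-6  # 1.0 um, inside the assumed NIR response band

# Illustrative temperatures only.
for label, temp_k in [("human skin (~305 K)", 305.0),
                      ("knife near red heat (~900 K)", 900.0),
                      ("knife glowing orange (~1300 K)", 1300.0)]:
    radiance = planck_spectral_radiance(wavelength, temp_k)
    print(f"{label}: spectral radiance at 1.0 um ~ {radiance:.3e} W/(sr*m^3)")
```

The skin-to-glowing-knife ratio comes out around fifteen orders of magnitude, which would explain why the camera sees reflected NIR and glowing-hot objects but not ordinary body heat.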
I figure the camera's sensitivity ultimately defines how much IR intensity is required.
My goal is to be able to clearly see the outlines of people and vehicles in a basic surveillance application. (I cannot afford a camera with a microbolometer image sensor right now.)
I have seen IR illuminators rated to 300 m, but I need to reach at least 1,000-2,500 ft (size and weight are a major issue)... and again, I need to figure out how much energy needs to be concentrated on a given area at a given distance to get my minimum usable video quality (camera dependent).
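As a starting point, here is a minimal inverse-square sketch of on-target irradiance from a diverging illuminator beam. The power, wavelength, and beam angle are hypothetical placeholders (a 10 W, 850 nm illuminator with a 10-degree beam), and it ignores atmospheric loss and beam non-uniformity; the point is just how quickly the energy thins out at the distances in question.

```python
import math

def target_irradiance_w_per_m2(radiant_power_w: float,
                               full_beam_angle_deg: float,
                               distance_m: float) -> float:
    """Rough on-axis irradiance at distance_m from an illuminator whose total
    radiant power is spread uniformly over a cone of the given full beam angle."""
    half_angle_rad = math.radians(full_beam_angle_deg / 2.0)
    spot_radius_m = distance_m * math.tan(half_angle_rad)
    spot_area_m2 = math.pi * spot_radius_m**2
    return radiant_power_w / spot_area_m2

# Hypothetical illuminator: 10 W total radiant power, 10-degree beam.
for distance_ft in (1000, 2500):
    distance_m = distance_ft * 0.3048
    irr = target_irradiance_w_per_m2(10.0, 10.0, distance_m)
    print(f"{distance_ft} ft ({distance_m:.0f} m): ~{irr * 1e3:.2f} mW/m^2 on target")
```

With those assumed numbers the irradiance falls to a few milliwatts per square metre at 1,000 ft and well under 1 mW/m² at 2,500 ft, which is why long-range illuminators rely on narrow beams and/or much higher power; the minimum usable figure still depends on the camera's sensitivity and lens.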
Most important question: how much IR energy is considered harmful or lethal? People will be unknowingly exposed to the IR during operation.
Any help, and guidance on the theory behind your answer, would be greatly appreciated!
The knowledge will be used toward a project focused on developing technology to assist a non-profit criminal profiling agency in its work to recover abducted children.
This is by no means my area of expertise!