Wednesday, 23 February 2022

Infra-Red, Infra-Blue, Infra-Green.. or Eat Radioactive Carrots to See in the Dark..

 Hi WWW

Years ago I came up with a system to enhance infra-red vision.. it always annoyed me that spacesuits let radiation in via the glass in the helmet.. also that this is the weakest part of the suit: if a rock flies up and cracks the glass, well, that equals death.

This is dumb.

So in another article I did on my Scottish Politics blog (scotspol) I described a system for seeing while the helmet is enclosed in metal.

The idea: a camera that can see through sandstorms feeds a system which then generates a basic graphical world for the display inside the helmet.

How Does It Work?

"An infrared intemperature probe measures temperature by detecting the infrared energy emitted by all materials which are at temperatures above absolute zero, (0°Kelvin)."

Credit: Omega 
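For reference, the physics behind that quote is the Stefan-Boltzmann law: the power an object radiates goes as P = ε·σ·A·T⁴, where T is its absolute temperature in Kelvin, A its area, ε its emissivity and σ ≈ 5.67×10⁻⁸ W/m²K⁴.. so anything above 0 K radiates something, which is exactly why the camera still has a signal to work with even at Martian-cold temperatures.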

We have infra-red technology to see heat signatures above a certain temperature; this is then translated in software into a visual output we can understand.

It's that second bit.. the translation into what we can understand.. that we can manipulate to give someone the ability to see through sandstorms on, say, Mars.

Any object above absolute zero radiates in the infra-red (we set the parameters for what we capture).. and since we control the translation step, there's no fixed rule for how a given temperature has to be shown.. even something cold gives off a wavelength which we could show as blue or black.. so we would be able to make an 'infra-blue'.. I'm explaining this badly..

Say.. on Mars at night in a sandstorm, the ground registers -85°C and gives off infra-red. We translate this as blue in the software, so now the ground is blue. The sandstorm itself would be a different temperature, probably higher, say -65°C.. in the software we make this light brown, but give the ability to disable it.
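To make that concrete, here's a minimal sketch in Python of what that translation step could look like.. the temperature bands, colours and array shapes are just assumptions for illustration, not from any real flight system:

```python
import numpy as np

# Hypothetical colour bands (temperatures in °C -> RGB). Illustrative only;
# a real system would be calibrated to the actual sensor.
BANDS = [
    ("ground",    -float("inf"), -80.0, (0, 0, 255)),    # very cold ground -> blue ("infra-blue")
    ("sandstorm", -80.0,         -60.0, (181, 134, 84)), # storm temperatures -> light brown
    ("warm",      -60.0,  float("inf"), (255, 64, 0)),   # anything warmer -> red/orange
]

def colourise(temps_c, show_sandstorm=True):
    """Map a 2-D array of per-pixel temperatures (°C) to an RGB image."""
    h, w = temps_c.shape
    out = np.zeros((h, w, 3), dtype=np.uint8)
    for name, lo, hi, rgb in BANDS:
        if name == "sandstorm" and not show_sandstorm:
            continue  # leave those pixels black so the storm 'disappears'
        mask = (temps_c >= lo) & (temps_c < hi)
        out[mask] = rgb
    return out

# Fake frame: -85 °C ground with a -65 °C "sandstorm" patch in the middle.
frame = np.full((240, 320), -85.0)
frame[60:180, 80:240] = -65.0
img = colourise(frame, show_sandstorm=False)   # storm hidden, ground shown blue
```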

Night vision.. we combine night vision with the 'infra-blue' (infra-blue is just a name to describe what it is doing.. it doesn't mean it is this scientifically), and so we get more input to the software, increasing the spectrum the software can translate to the display..

And again with different temperature objects.. and so on.
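If it helps, here's a tiny hedged sketch of what that combining could look like in software.. normalise each sensor feed and blend them into one map for the colour-translation step to work on (the weights and names are made up here):

```python
import numpy as np

def normalise(channel):
    """Scale any sensor channel to 0..1 so different sensors can be mixed."""
    lo, hi = channel.min(), channel.max()
    return (channel - lo) / (hi - lo + 1e-9)

def fuse(ir_temps, night_vision, ir_weight=0.6):
    """Blend an IR temperature map with a low-light intensity map.

    Both inputs are 2-D arrays of the same shape; the output is a single
    0..1 map that the colour-translation step can then paint however we like.
    """
    return ir_weight * normalise(ir_temps) + (1 - ir_weight) * normalise(night_vision)
```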

Display

A powerful GPU is used to create a 'world', like a video game, on the display with the data amassed from the camera(s).

Realtime translation of visuals on an internal display in the helmet. A totally metal helmet.. the design of which I have been imagining as Halo's Master Chief but with a metal front.

Choose live feed, a combination of graphics and live feed, or totally graphics.. all done in software. If it's night on Mars.. you could set the software to show the terrain as daytime, and using the input data it simulates this for you.
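As a rough sketch of that mode-switching, again in Python with made-up names.. the display just picks the live camera frame, the GPU-rendered 'world', or a blend of the two:

```python
import numpy as np

def compose(live_feed, rendered, mode="mixed", mix=0.5):
    """Pick what goes to the helmet display.

    live_feed, rendered : HxWx3 uint8 images from the camera and the GPU 'world'.
    mode                : "live", "graphics", or "mixed".
    """
    if mode == "live":
        return live_feed
    if mode == "graphics":
        return rendered
    # "mixed": simple alpha blend; a real system would do something smarter.
    blend = mix * rendered.astype(float) + (1 - mix) * live_feed.astype(float)
    return blend.astype(np.uint8)
```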

Can We Do This Today or is This Sci-Fi?

Well.. it'd be very basic right now, but yes, we can do this today.. the GPUs to make software output look like a live feed don't exist to the extent they will in the future.. but yes, we can have a reasonable, if a little janky, version of this up and running..

Yes's and No's

No, we can't have a screen that curves inside the helmet.. we are working on that tech just now, but as of now it's defo not ready and breaks very easily.

Yes we can put three hard displays, left, centre, right, in the helmet and it would work.

Yes we have the cameras.

No.. the power for the suit wouldn't give us much time.. it would take quite a bit of power and we're not there yet. Even with high-spec batteries etc.. we would probably be looking at a mini He-3 reactor (helium-3), some lithium batteries and probably.. some mini Stirling engines.

Yes, with lithium batteries we could get a couple of hours.. but like I'm saying, this is janky and needs more money and research.. we need the suit to operate in excess of 24 hours, and 2 h just isn't enough. I am imagining having to trek home at night in a sandstorm.. of course oxygen is a concern.. but I will guess we will get good at tapping permafrost for oxygen, like people tapped for water 1000 years ago; the suit just needs a miniature pressurization in/out seal.
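As a very rough back-of-envelope, with numbers I am assuming rather than quoting: if the cameras, GPU and three displays pull something like 200 W together, a 500 Wh lithium pack gives about 500 / 200 = 2.5 hours, which is where 'a couple of hours' comes from.. running the same kit for 24+ hours would need on the order of 5 kWh, which is why you end up talking about reactors and Stirling engines rather than batteries alone.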

In a pinch a couple of hours is better than nothing.

Yes.. we have the metals.. titanium.

No.. we don't have GPUs power-efficient enough and powerful enough to bring the idea to full fruition and unlock its potential.. realtime realism.

Yes.. we can, in realtime, make a plane of a world and populate it with objects from the input data and colour them as we please in the software. Even cel-shaded.. basic.

We don't have the translation software because.. as far as I know, I'm presenting this idea here for the first time.

Wish I could talk to someone at.. JPL or idk MIT.

peace

Dava

