Multispectral Imaging and Camera

Developments in machine learning and remote sensing <e.g. multispectral imaging; multispectral cameras> are making image analysis feasible and scalable for crowdsourcing, citizen science, and consumer products. Sensing the Earth’s atmosphere with high-resolution ground-based sky cameras can provide cheaper, faster, and more localized data acquisition. At the same time, it can lead to new use cases and markets that make multispectral cameras cheaper, more scalable, and more feasible to use.

Because of the inherent high dimensionality of data from multispectral cameras and multispectral imaging, it is expensive to use the raw data directly for analysis. By combining Weather Flow with NASA Globe Cloud (https://www.globe.gov/web/s-cool/home/participate), Weather Flow and the Weather Flow community could be the key hardware, crowdsourcing, and citizen-science platform/ecosystem/partner that produces extensive experimental results, builds new datasets, and enhances existing datasets for segmentation, classification, and denoising of sky and cloud images for weather conditions and forecasting.
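
As one concrete example of the kind of analysis such datasets enable, here is a minimal sketch (mine, not Weather Flow’s; it assumes numpy and an ordinary 8-bit RGB sky image) of the classic red/blue-ratio baseline for cloud segmentation in ground-based sky images. The 0.6 threshold is illustrative and would need tuning per camera:

```python
import numpy as np

def segment_clouds_rbr(rgb: np.ndarray, threshold: float = 0.6) -> np.ndarray:
    """Return a boolean cloud mask for an (H, W, 3) sky image.

    Clear sky scatters blue much more strongly than red, so its red/blue
    ratio is low; cloud droplets scatter all visible wavelengths about
    equally, so their ratio approaches 1.
    """
    rgb = rgb.astype(np.float32)
    red, blue = rgb[..., 0], rgb[..., 2]
    rbr = red / np.maximum(blue, 1.0)  # guard against division by zero
    return rbr > threshold             # True where the pixel looks cloudy
```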

As such, if Weather Flow has not already reviewed the feasibility of incorporating multispectral imaging and cameras, I request that Weather Flow do so based on the current state of sensors and machine learning.

First, based on “soon-to-be cheap cloud camera,” I do not think you have a multispectral camera. Multispectral cameras are not cheap, and they have not become commonplace in consumer products or reached the point of being a “soon-to-be cheap cloud camera” for consumers. Even so, you might be able to set up filters to achieve multispectral imaging. A few examples, with some pros and cons of using filters (see the sketch after this list):

  1. Rotating filter wheel
    • Pros: large selection of filters available, high resolution, and lower cost
    • Cons: subject must remain static; large filter-wheel assembly
  2. Single-, dual-, and multi-bandpass filters using multiple cameras
    • Pros: application-specific filters, high resolution, and lower cost
    • Cons: number of cameras required is proportional to the number of wavelengths
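
To make option 1 concrete, here is a minimal sketch of a filter-wheel capture loop. It is only an illustration, assuming hypothetical hardware hooks rotate_to_filter() and capture_frame() (neither is a real Weather Flow or vendor API): one grayscale frame is exposed per filter, and stacking the frames yields a multispectral cube.

```python
import numpy as np

FILTERS_NM = [450, 550, 650, 850]  # illustrative band centers, not a real spec

def rotate_to_filter(center_nm: int) -> None:
    """Hypothetical hardware hook: spin the wheel until the filter centered
    on `center_nm` sits in front of the sensor, then wait for it to settle."""
    pass  # placeholder for real firmware

def capture_frame() -> np.ndarray:
    """Hypothetical hardware hook: read one grayscale frame off the sensor."""
    return np.zeros((480, 640), dtype=np.float32)  # placeholder frame

def capture_cube() -> np.ndarray:
    """Capture one frame per filter and stack into an (H, W, n_bands) cube."""
    frames = []
    for center in FILTERS_NM:
        rotate_to_filter(center)        # mechanical step: this is the
        frames.append(capture_frame())  # "static subject" con, since the
                                        # scene must not move between frames
    return np.stack(frames, axis=-1)    # multispectral cube, one slice per band
```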

Second, many consumer companies include “cloud” or “cloud camera” in marketing materials and product names to indicate that the camera uses cloud storage (sending footage to a server you can access via smartphone) rather than local storage (saving footage on a memory card).

For general background on multispectral imaging and cameras, I recommend the video titled “What Is Multispectral Imaging? – Vision Campus” at https://www.youtube.com/watch?v=b0webdvlySo.

Who should Weather Flow partner with to develop the camera?

I recommend Gigajot Technology based on the people behind the startup and what they have done so far <e.g. https://www.laserfocusworld.com/test-measurement/research/article/16555293/advances-in-detectors-the-quanta-image-sensor-qis-making-every-photon-count.>

Timeline to develop the hardware and feature?

  • 3 years to develop the hardware for the feature, based on “HyperCam” as a point of reference and example.

  • 3 to 5 years to develop the machine learning for the feature, based on the state of computer vision, the status of NASA Globe Cloud, the state of cloud-formation classification for weather status and forecasting, and the time needed to take dataset(s) + algorithm(s) from alpha to beta. Another reason for the 3-to-5-year estimate is better explained by watching Professor Fei-Fei Li’s talk “How we teach computers to understand pictures” at https://www.youtube.com/watch?v=40riCqvRoMs.

“* 3 years to develop hardware for feature based on “HyperCam” as a point of reference and example.”

The reason for the HyperCam reference and example: it was a low-cost hardware implementation of hyperspectral imaging from around 2015 (https://ubicomplab.cs.washington.edu/pdfs/hypercam.pdf), scalable as a smartphone add-on for about $50.00 based on 2015 supply chains.

Not sure what is causing my posts to be flagged or who is flagging my account. Even so, to reply and/or provide further details:

Reasons for ground-based imaging and the use of multispectral imaging:

  1. Deliver more accurate weather forecasts.
  2. Improve rainfall and satellite precipitation estimates.
  3. Investigate cloud structures, improve understanding of cloud formation, and correct cloud datasets so that weather datasets and models accurately reflect cloud structures, how clouds form, and clouds’ impact on the atmosphere.
  4. Optimize communication links.

Unlike satellite imagery and aerial photographs, ground-based imaging of clouds can provide sufficient or better temporal and/or spatial resolution for localized, short-term cloud analysis over a particular area. Ground-based imaging of clouds also offers a lower-cost alternative to satellite imagery and aerial photographs. At the same time, images obtained from ground-based imaging combined with satellite imagery and aerial photographs would provide high-resolution, hyper-local data and datasets about local cloud formation, movement, and other atmospheric phenomena that were not available before.

Not sure what is causing my posts to be flagged or who is flagging my account. Even so, to reply and/or provide further details:

Going back to an earlier post, and posting again for reference: “* 3 years to develop hardware for feature based on “HyperCam” as a point of reference and example. Reason for HyperCam reference and example, was a low-cost hardware implementation of hyperspectral imaging around 2015 https://ubicomplab.cs.washington.edu/pdfs/hypercam.pdf and scalable as smartphone add on for about $50.00 based on 2015 supply chains.” Weather Flow has demonstrated with “Wind & Weather Meters” the institutional capability, unique knowledge, and patience to formulate products and courses of action that lead to feasible ways of making crowdsourcing ubiquitous and immersive. As such, they could start with a camera add-on to begin ground-based imaging that improves on NASA Globe’s use of smartphone cameras.

Not sure what is causing my posts to be flagged or who is flagging my account. Even so, to reply and/or provide further details:

Reason for the reference to and partnership with NASA Globe: clouds are powerful agents of global change. They affect the overall temperature of the Earth and play a large role in controlling the planet’s climate. NASA and others need more accurate data on clouds to understand their impact.

NASA and other space agencies have a number of satellites orbiting the Earth and collecting data about Earth’s atmosphere. While satellites give a big picture of what’s going on, they sometimes have trouble with the details.

NASA has proven the need for crowdsourcing weather data by forming public-private partnerships. At the same time, NASA has proven the value of collecting data by citizen scientists and STEM programs to better understand the different types of clouds and the effects they have on our Earth’s climate. Satellites only see the top of the clouds while you see the bottom. By putting these two vantage points together, we get a much more complete picture of clouds in the atmosphere.

Yes, I know that, and I mentioned it again in case the post is removed. Do not forget I also mentioned who Weather Flow should partner with to develop multispectral cameras: “Who should Weather Flow partner with to develop the camera? I recommend Gigajot Technology based on the people behind the startup and what they have done so far <e.g. https://www.laserfocusworld.com/test-measurement/research/article/16555293/advances-in-detectors-the-quanta-image-sensor-qis-making-every-photon-count.>” Not sure if you know this, but unlike visible cameras, multispectral cameras do not require “LEDs with different wavelengths” to capture the environment.

Also, if you read the entire PDF I linked about HyperCam, you will see that HyperCam is an example of how to separate out and scale a multispectral camera as an add-on.

Not sure how I am being vague. Do you understand the difference between what is captured by visible cameras (400~700 nm) and what is captured by multispectral cameras, with bands such as the following?

  • Band 1: Coastal aerosol (0.43–0.45 µm)
  • Band 2: Blue (0.45–0.51 µm)
  • Band 3: Green (0.53–0.59 µm)
  • Band 4: Red (0.64–0.67 µm)
  • Band 5: Near infrared (NIR) (0.85–0.88 µm)
  • Band 6: Short-wave infrared (SWIR) 1 (1.57–1.65 µm)
  • Band 7: Short-wave infrared (SWIR) 2 (2.11–2.29 µm)
  • Band 8: Panchromatic (0.50–0.68 µm)
  • Band 9: Cirrus (1.36–1.38 µm)
  • Band 10: Thermal infrared (TIRS) 1 (10.60–11.19 µm)
  • Band 11: Thermal infrared (TIRS) 2 (11.50–12.51 µm)
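
To make concrete how much of that spectrum an RGB camera misses, the list above can be written down as data and checked. This is just a sketch in Python, with the band edges copied from the list:

```python
# Band edges in micrometers, copied from the list above.
BANDS_UM = {
    "coastal_aerosol": (0.43, 0.45), "blue": (0.45, 0.51),
    "green": (0.53, 0.59),           "red": (0.64, 0.67),
    "nir": (0.85, 0.88),             "swir1": (1.57, 1.65),
    "swir2": (2.11, 2.29),           "panchromatic": (0.50, 0.68),
    "cirrus": (1.36, 1.38),          "tirs1": (10.60, 11.19),
    "tirs2": (11.50, 12.51),
}

VISIBLE_UM = (0.40, 0.70)  # approximate range of a visible (RGB) camera

# A band is entirely invisible to an RGB sensor if it lies wholly outside 400-700 nm.
invisible = [name for name, (lo, hi) in BANDS_UM.items()
             if hi < VISIBLE_UM[0] or lo > VISIBLE_UM[1]]
print(invisible)  # nir, swir1, swir2, cirrus, tirs1, tirs2 -- six of eleven bands
```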

A lot of data is missed by using only visible cameras. The reason we need multispectral cameras is that researchers use and value citizen-science cloud data because it helps validate data from Earth-observing instruments. Scientists at NASA and other space agencies work with a suite of six instruments known as the Clouds and the Earth’s Radiant Energy System (CERES).

Even though CERES’ instruments use advanced technology, it is not always easy for researchers to positively identify all types of clouds in their images. For example, it can be difficult to differentiate thin, wispy cirrus clouds from snow, since both are cold and bright; even more so when cirrus clouds are above a surface with patchy snow or snow cover. One solution to this problem is to look at satellite images from a particular area and compare them to data submitted by citizen scientists on the ground.
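
A sketch of why those extra bands matter for exactly this snow-versus-cirrus problem: the standard Normalized Difference Snow Index (NDSI) contrasts a green band with a SWIR band. Snow reflects green strongly but absorbs SWIR, while most clouds remain relatively bright in SWIR, so the two separate in a way an RGB-only camera cannot reproduce. The 0.4 cutoff below is the commonly cited rule of thumb, not a tuned value:

```python
import numpy as np

def ndsi(green: np.ndarray, swir1: np.ndarray) -> np.ndarray:
    """NDSI = (green - swir1) / (green + swir1), computed per pixel
    on reflectance arrays of the same shape."""
    denom = np.maximum(green + swir1, 1e-6)  # guard against divide-by-zero
    return (green - swir1) / denom

def looks_like_snow(green: np.ndarray, swir1: np.ndarray) -> np.ndarray:
    # Snow absorbs SWIR, so its NDSI is high; cloud NDSI is typically lower.
    return ndsi(green, swir1) > 0.4  # illustrative threshold
```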

Looking at what an observer <person(s) and/or personal weather station(s)> recorded as clouds, and looking at their surface observations, really helps us better understand the images that were matched from the satellite. Such observations would be enhanced if the data were able to fully validate Clouds and the Earth’s Radiant Energy System (CERES) instruments and observations from the ground.

Not sure what is the reason for my posts being deleted, or for the majority of my posts being flagged, combined with the person flagging my account not coming forward with how I could address the spam and/or Weather Flow guideline-violation accusation(s). Even so, to reply and/or provide further details:

Ok. You must be joking or playing me?

  1. “good for what comes from a satellite system”

“visible cameras (400~700nm) vs multispectral cameras” <see the 11-band list above.>

Not limited to a satellite system.

  1. <I think one of your posts has been flagged -> deleted; while typing this reply it was no longer in the thread> “Also you write “Not sure if you know this, unlike visible cameras, multispectral cameras do not require “LEDs with different wavelengths” to capture the environment.” and you are right about that. However the HyperCam in the PDF uses a single-band camera, with a very broad band. It does use LEDs to capture the objects at different wavelengths. (This is the third time I write that, please read the article yourself!) This HyperCam is useless for taking pictures of clouds.”

You need to read my replies; you do not understand multispectral imaging and cameras.

For example, if you understood multispectral imaging and cameras, you would never have posted anything about such systems requiring LEDs for illumination.

For example, if you understood multispectral imaging and cameras, you would never have posted anything about such systems being limited to satellite systems.

Regarding HyperCam, I posted multiple times that HyperCam is “a point of reference and example” <e.g. “add on for smartphone”>, which you fixate on while ignoring the rest of my replies <e.g. who Weather Flow should partner with to develop multispectral cameras: “I recommend Gigajot Technology based on the people behind the startup and what they have done so far,” https://www.laserfocusworld.com/test-measurement/research/article/16555293/advances-in-detectors-the-quanta-image-sensor-qis-making-every-photon-count.>

  1. <I think one of your posts has been flagged -> deleted; while typing this reply it was no longer in the thread, see screenshot above> “Bands you are listing are used by satellites, not by ground based stations. My question still stands: “What wavelengths do you wish to have images of the clouds in, and what extra info do these wavelengths add? Why multi-spectral?” E.g., what extra bands besides the obvious RGB do you want to use? As mentioned, I could see some use for infrared to see the clouds at night, but beyond that, what is it good for?
    And yes, satellites might have problems differentiating clouds from snow, but the best way to augment that observation from a ground based system is simply reporting if there is snow :-). Tell us what a ground based multi-spectral system would add compared to normal RGB images.”

For example, you do not understand multispectral imaging and cameras: “bands you are listing are used by satellites, not by ground base stations” has no basis in reality.

Multispectral imaging and cameras are not limited to satellites. Have you ever interacted with geologists? Volcanologists? Oceanographers? Astronomers? Biologists? Have you ever used a multispectral camera? Have you ever dealt with the difference in data captured by a multispectral camera vs. a visible camera when trying to understand a space and/or environment and/or object later on <e.g. not able to physically visit again>?

  1. <I think one of your posts has been flagged -> deleted; while typing this reply it was no longer in the thread, see screenshot already provided> “And yes satellites might have problems differentiating clouds from snow. but the best way to augment that observation from a ground based system is simply reporting if there is snow :-).”

You must be playing me?

Seriously? “best way to augment that observation from a ground based system is simply reporting if there is snow :-).”

Not all data points and views required for observation can be gathered “simply by reporting it,” for a host of reasons <e.g. multiple trips are hazardous; limited capital and human resources; lessons learned and reasons to augment human observation with remote sensing.> On that note or train of thought, why do you have a personal weather station <e.g. Weather Flow>?

“satellites might have problems differentiating clouds from snow.” So just because something has problems differentiating one thing from another, remote sensing or multispectral imaging and cameras should not be used?

  1. “Tell us what a ground based multi-spectral system would add compared to normal rgb images.”

Apparently my earlier reply (quoted in full above, from “A lot of data is missed by using only visible cameras” through “…observations from the ground”) is not good enough; a camera add-on for smartphones with multispectral capabilities is not a good enough starting point or point of reference, because to you measuring “using more then one band (ignoring the obvious rgb bands)” is highly questionable; the value of data that would otherwise not be captured, and of denoising such data, is not good enough; and building upon the NASA Globe Cloud program <e.g. coming closer to fully validating NASA data> is not good enough. I am not sure how I can answer any of your questions any further.

Bottom line: you see multispectral imaging and cameras as questionable and limited to satellite systems, and you do not understand multispectral imaging and cameras.

FYI: NASA’s take on the NASA Globe Cloud program. Imagine if a future version of the Weather Flow Weather Station had a multispectral camera combined with machine learning, maybe with 4G and/or satellite-link integration too, to help augment and easily fully validate NASA data <e.g. match ground-based observations with the overpass of a cloud-observing satellite.>
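
To illustrate the satellite-match idea in that last sentence, here is a hypothetical sketch of pairing ground-observation timestamps with satellite overpass times; the timestamps and the 15-minute window are made up for the example:

```python
from datetime import datetime, timedelta

def match_overpasses(observations, overpasses, window=timedelta(minutes=15)):
    """Return (observation, overpass) timestamp pairs closer than `window`."""
    matches = []
    for obs in observations:
        for op in overpasses:
            if abs(obs - op) <= window:  # within the matching window
                matches.append((obs, op))
    return matches

# Illustrative data: two ground observations, one satellite overpass.
obs = [datetime(2019, 5, 1, 14, 2), datetime(2019, 5, 1, 18, 40)]
ops = [datetime(2019, 5, 1, 14, 10)]
print(match_overpasses(obs, ops))  # only the 14:02 observation matches
```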

This whole discussion would probably have a better audience at WXforum.net

Thanks for the feature suggestion, @roy_mitsuoka! It’s certainly an interesting idea for a future hardware device. We’ll keep it in mind.

PS: It looks like some of your posts were being auto-flagged by the system as potential spam because of the high volume and number of external links. I’ve restored most of them - there may be some redundancy in there now.

“thanks but no thanks.”

You should have started your first reply with that.

Seriously? “insult people doesn’t really help, does it.”

How did I insult people? What people have I insulted? How did I insult you?

I based “Bottom line, you see multispectral imaging and camera as questionable, limited to satellite systems, and you do not understand multispectral imaging and camera.” and similar statements <e.g. “you do not understand multispectral imaging and camera”> on your replies and questions. If you know about and have worked with multispectral imaging and cameras, why would you ask the following questions or post the following statements:

  1. “how do I turn my soon-to-be cheap cloud camera into a multispectral one?”
  2. “Indeed I don’t have a multi spectral camera.”
  3. “useless for weather purposes,”
  4. “useless for weather purposes, as here the spectral feature is created by the led illumination, not by the camera.”
  5. “keep in mind what you want to achieve with a multi-spectral camera for a ground based weather station? Note that this is different from satellites.”
  6. “ground based station basically only sees blue sky or clouds.”
  7. “spectral part comes from illuminating the object with LEDs with different wavelenghts.”
  8. “frequency (band) is good for what comes from a satellite system. It doesn’t apply to ground based systems.”

“With respect to point #1. Those ARE useful for satellite imaging systems, but are not so useful for ground based systems. Your comment “Not limited to a satellite system” doesn’t answer any question I had. Tell us what info you would get from a ground based system using spectral imaging, that you wouldn’t get by using RGB (and possibly infra red).”

I posted “Not limited to a satellite system” because you posted several false statements about multispectral imaging and cameras <e.g. see examples 1 to 8 above.>

“point #2. I do understand spectral imaging perfectly well. You are the one that posted a reference to hypercam. That one is not a true multi spectral camera. It uses a trick with LED. And yes I’m ignoring the rest (allthough I looked at the article), you first need to make sense of why and how!”

I do not know how many times I have to explain and state that HyperCam is an example of how to make a multispectral camera as a smartphone add-on based on 2015 supply chains, with the actual sensor made by a company like Gigajot Technology.

“point #3, I repeat, I perfectly do understand what is needed for a real spectral imager, how to process the data and how it can be useful. You just can’t tell me what specifically is useful for ground based spectral imaging (besides rgb and infra red). But I’m just repeating myself as you don’t answer any question. Give me a reference to some article using ground based camera for cloud photography.”

If “point #3” is true, how do you not understand my HyperCam + Gigajot Technology reference? The people behind HyperCam were able to scale their product as a camera add-on for smartphones for about $50.00 based on 2015 supply chains. Minus the component you are fixated on, it would be cheaper today. The supply chains required for HyperCam are similar to what would be needed to make an affordable multispectral camera part of the Weather Flow Weather Station.

“point #4 you are the one that brought up the example of satellites having problems with snow detection. It doesn’t help in any obvious way to supply nasa with a ground based spectral image of the clouds (they might be happy with a rgb picture of clouds, so the know there are clouds).”

You asked for examples.

Seriously? “It doesn’t help in any obvious way to supply nasa with a ground based spectral image of the clouds (they might be happy with a rgb picture of clouds, so the know there are clouds).”

“point #5. “One solution to this problem is to look at satellite images from a particular area and compare them to data submitted by citizen scientists on the ground. Looking at what an observer <person(s) and/ or personal weather station(s)> recorded as clouds and looking at their surface observations really helps us better understand the images that were matched from the satellite” That’s true, but normal rgb images from the ground would answer that question, no multi spectral ground based imaging is needed.”

You left out the sentence “Such observations would be enhanced if the data is able to fully validate Clouds and the Earth’s Radiant Energy System (CERES) instruments and observation from ground based observations.”

Again, if you understand multispectral imaging and cameras: in order to “fully validate Clouds and the Earth’s Radiant Energy System (CERES) instruments and observation from ground based observations,” you need closely matched datasets, which cannot be achieved with visible cameras alone.

Thank you @dsj. Given the impact of climate change and the growing occurrence of severe weather incidents, and their effects on community resilience and resources, we as a community of weather nerds need to ensure there are no blind spots: weather services should cover as many areas and as many people as possible, maintain awareness of the environment 24/7, and make man-made impacts more transparent and clear, given the influence of bots and partisans.

@vreihen, the reason I chose Weather Flow instead of WXforum is the applicability of “Multispectral Imaging and Camera” to the design choices being made by Weather Flow.