[REBOL] Re: Image processing and embedding metadata in images?
From: media:quazart at: 13-Nov-2001 8:42
I just thought I'd poke in this discussion with a little bit of my two
cents... and some interesting metadata usage...
just to add a bit on metadata usage: even hardware is now starting to
include metadata directly on chips built into the casing... the
latest video and archiving tapes from high-end companies (like Sony
DTF-2 archiving tapes) have EEPROMs inside the cartridge that store the
tape's directory instead of keeping it on tape only... this improves access
times considerably, and when the tapes are used in tape libraries, it lets
them cache the structure instantly without having to rewind the tape
every time it is removed from or inserted into the tape robot...
some of the most advanced data tracking systems now use magnetic-type
stickers on the tapes themselves, which are detected by sensors at each door of a
building's production facility... this lets a producer track a tape
anywhere in the building at any moment.
So, given the common task of needing a clip from a tape on short notice
(a late-breaking news spot, for example), the director can browse the
company's image library (everything is referenced by metadata) and the
system tells him where the clip physically is in the building and in what
formats... this works even if the tape has left the archives and the guy (or gal
;-) carrying it is taking a break in the cafeteria.
On another level, some systems now even do image analysis to detect
similarities between images, and will return VERY convincing results when asked to find
similar images... the fastest systems use colorspace evaluation, and this
works pretty well. For example, they store a histogram-type
analysis of each image which tells the system how much red or luminance (amongst
MANY other things) the image contains, or any precise anomaly. When you look
for similar images, the system cross-references the current image with this
analysis metadata and returns (within a precision threshold) any
image which seems to hold roughly the same analysis or some peculiarly similar
detail... surprisingly enough, if you look up an image shot at a specific
date and time, it will most probably return all images of the same subject
shot under the same lighting conditions. It's quite impressive to see.
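just to make the histogram idea concrete, here is a minimal sketch (in Python rather than REBOL, and with made-up function names — real systems use far richer features than a coarse color histogram):

```python
# Sketch of histogram-based image similarity as described above.
# Pixels are plain (r, g, b) tuples; a real system would use an
# imaging library and add luminance, texture, etc. to the analysis.

def coarse_histogram(pixels, bins=4):
    """Bucket (r, g, b) pixels into a coarse 3-D color histogram,
    normalized so images of different sizes stay comparable."""
    hist = [0.0] * (bins ** 3)
    step = 256 // bins
    for r, g, b in pixels:
        idx = (r // step) * bins * bins + (g // step) * bins + (b // step)
        hist[idx] += 1
    total = float(len(pixels))
    return [h / total for h in hist]

def similarity(h1, h2):
    """Histogram intersection: 1.0 means identical color
    distributions, 0.0 means completely disjoint ones."""
    return sum(min(a, b) for a, b in zip(h1, h2))
```

the stored histograms are the "analysis metadata"; at query time you only compare those small vectors, never the full images, which is why these systems are fast.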
So, someone pointed out that it's better to link to the metadata... I'm not so
sure... corruption is a real risk with linking, and agencies
cannot run a system which simply _could_ corrupt itself... if the metadata
is stored right in the image, it's MUCH safer, and in the case of a system
crash, the system just has to re-index the images to rebuild its data
structure. I think most systems will use a two-level structure where the
data is within the images AND in a "live" (linked) database.
Here is one solution, if your application is a closed-circuit one and only
needs to serve a precise boundary (inside your office or
agency, for example). You can embed metadata in pretty much any image
(the saved format must not be lossy, of course... so JPEG is out ;-) by
adding some binary data within a region of the image which is beyond the
visible bounds... just stamp the image onto a bigger canvas and encode
your binary data as RGB values in the extra space. For a more invisible (and
faster) approach, you could use a format with an alpha channel and encode your
metadata in there! You just need to be able to poke/peek into the RGB
array, and then it's quite easy to write little codecs which dump data to and
read it back from the RGB (or alpha) channel/buffer/raster/whatever.
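the alpha-channel variant can be sketched in a few lines (Python here rather than REBOL, and embed_meta/extract_meta are names I made up for illustration — remember this only survives lossless formats):

```python
# A minimal sketch of the alpha-channel metadata trick described above.
# Pixels are flat lists of (r, g, b, a) tuples; RGB values are left
# untouched, so the visible image does not change.

def embed_meta(pixels, data):
    """Poke metadata bytes into the alpha channel, one byte per pixel.
    The first two alpha bytes store the payload length (big-endian)."""
    payload = len(data).to_bytes(2, "big") + data
    if len(payload) > len(pixels):
        raise ValueError("image too small for this payload")
    out = list(pixels)
    for i, byte in enumerate(payload):
        r, g, b, _ = out[i]
        out[i] = (r, g, b, byte)
    return out

def extract_meta(pixels):
    """Peek the metadata back out of the alpha channel."""
    length = int.from_bytes(bytes(p[3] for p in pixels[:2]), "big")
    return bytes(p[3] for p in pixels[2:2 + length])
```

the same peek/poke idea works for the bigger-canvas approach: you just write the bytes into the RGB values of the extra rows instead of the alpha channel.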
I used to do this kind of thing on a VIC-20 style system... so I think it's still possible
now ;-)
my 1 cent (2 cents worth of Canadian money)