HTML doesn't specify any standard for audio (or video, for that
matter, or any other media). I could write an HTML file that references
any type of sound file I want, and as long as I'm reasonably sure that
my audience has its MIME type mapped to some sort of external player,
I can do anything. The same goes for images - in fact, I could inline
any type of image I want; it's simply that NCSA Mosaic and most other
browsers will only inline GIFs and XBMs for the time being.
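To make that concrete, here's a hypothetical fragment (the filenames are made up) showing how a page can already point at any media type and leave playback to the client's MIME configuration:

```html
<!-- The browser doesn't need to understand AIFF at all; if the user's
     mailcap maps audio/x-aiff to an external player, following the
     link "just works". -->
<A HREF="sounds/ambience.aiff">ambient sound (AIFF, external player)</A>

<!-- Inlined only if the browser happens to support the format. -->
<IMG SRC="textures/wall.gif">
```

The dispatch happens via the server's MIME-type mapping plus a client-side mailcap entry along the lines of `audio/x-aiff; showaudio %s` (assuming a metamail-style setup).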
I think we should allow similar flexibility, though I don't think
we need to mandate sound capability in browsers at this initial stage.
> > One thing different from HTML's dealing
> >with audio is that we might want to be able to loop audio.
> For sure, and have audio that responds to location?
And direction, and decay, and echo, and... it's a *huge* problem,
depending on how realistic one wants to get.
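Just to show how quickly the scope grows: even the simplest "realistic" touch, decay with distance, already requires tracking source and listener positions, and direction and echo only layer on top of that. A minimal sketch of distance decay (my assumption - a plain inverse-distance falloff, not anything specified anywhere):

```python
import math

def audio_gain(source, listener, reference=1.0):
    """Inverse-distance decay: full volume inside the reference
    radius, then gain falls off as 1/d. Positions are (x, y, z)."""
    d = math.dist(source, listener)
    if d <= reference:
        return 1.0          # full volume close to the source
    return reference / d    # halves each time the distance doubles

# e.g. a listener twice the reference distance away hears half gain.
```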
> > Does anyone have
> >any thoughts on this? What other non-geometry things might we want to embed
> >(not link to) in a VRML file?
> Should we allow bit mapped faces? I think so, and I do not know how such
> bit maps should be handled. What about text on a face? Maybe even HTML
> embedded on a face of a poly. If we are to allow bit mapping, maybe even
> simple animations.
Bitmaps could (and should) be handled the same way they currently are
in most web browsers - i.e., people could run with "auto-load bitmaps
off", and when they "pick" a square representing a bitmap, it would get
downloaded and materialize. This is more of a browser issue, though -
all we have to be able to say in the language is "this polygon is
mapped by this bitmap" and we're fine.
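The split between language and browser can be sketched like this (hypothetical names throughout - the file format only records the polygon-to-bitmap association; fetching on pick is purely browser behaviour):

```python
class TexturedPolygon:
    """All the language has to say: this polygon, this bitmap URL."""

    def __init__(self, vertices, bitmap_url):
        self.vertices = vertices
        self.bitmap_url = bitmap_url
        self.bitmap = None          # not fetched yet

    def pick(self, fetch):
        """Browser-side: called when the user picks the placeholder
        square; `fetch` is whatever the browser uses to grab a URL."""
        if self.bitmap is None:     # honour "auto-load bitmaps off"
            self.bitmap = fetch(self.bitmap_url)
        return self.bitmap

# Usage: the bitmap materializes only on first pick.
poly = TexturedPolygon([(0, 0, 0), (1, 0, 0), (1, 1, 0)],
                       "textures/brick.gif")
poly.pick(lambda url: f"<bitmap data from {url}>")
```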