
GemFAQ

Frequently Asked Questions

Using Gem

How do I (???)

Many of the general usage questions are probably answered in the manual or release notes.

The Pd mailing list is also a good place to find answers.

Why doesn't GEM run?

Note that the -lib flag always requires Unix-style (forward) slashes, even on Windows.

You may also want to use the -nosound flag. For instance, my PC has problems using audio (it leaks memory), so I just turn off the audio part of Pd. However, other people can't get GEM to work if -nosound is used (on Win95). You can also try the -nodac or -noadc flags (to disable only the digital-to-analog conversion or only the analog-to-digital conversion, respectively).
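For example, a startup command might look like this (the install path is just a placeholder; point it at wherever Gem actually lives on your machine):

  pd -lib C:/pd/extra/Gem/Gem -nosound

The same forward-slash form is used on Linux and IRIX, e.g. pd -lib /usr/local/lib/pd/extra/Gem/Gem (again, just an example path).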

I've got it running. Now what?

Try out the manual. It will step you through the basics.

You will also want to look at the example files. Assuming that everything is installed correctly, you can get to the examples by going to the Help menu in Pd and selecting examples. A bunch of the patches should start with gem.

The best one is gem/01.basic/01.redSquare.pd. It puts a red square up on the screen and allows you to rotate it.

gemImage.pd shows how to load in a TIFF file.

gem/03.lighting/04.moveSpheres.pd moves two spheres around the screen. Try the other ones. Most of the GEM objects have test patches which give some information about the various controls for the object.

On IRIX 5.3, why does GEM dump core with an rld error?

GEM only works under IRIX 6.2+. The rld error is probably a complaint about a missing symbol such as glBindTextureEXT. OpenGL 1.0 has some extensions to speed up texture mapping (which became an integral part of OpenGL 1.1), but these don't exist on IRIX 5.3. If you recompile GEM (see the next question), things should work fine.

I don't have access to an IRIX machine, so don't expect any builds from me. Upgrading to IRIX 6.2+ is worth it.

This FAQ applies to: 0.70, 0.72, 0.73, 0.74, 0.75, 0.76, 0.77, 0.78, 0.79, 0.80, 0.81, 0.82, 0.83, 0.84, 0.85

Why can't I compile GEM on IRIX 5.3?

There was probably an error saying that the compiler couldn't find the file "dmedia/vl_vino.h" in pix_videoSGI.cpp. IRIX 6.2+ adds new functionality to the media libraries which makes life much easier. You cannot compile pix_video or pix_indycam as is under 5.3. You can remove them from the Pix/Makefile and from the linker part of the global Makefile. You will also need to recompile the Td and Tiff libraries.

There shouldn't be any problems doing this. I haven't tried any of this, so if it works for someone, please let me know.

This FAQ applies to: 0.70, 0.72, 0.73, 0.74, 0.75, 0.76, 0.77, 0.78, 0.79, 0.80, 0.81, 0.82, 0.83

Why is GEM slow in general?

Examine what you are doing. If you are constantly changing textures, that is probably your problem. If you have models with a million triangles, that is probably your problem. Compare what you are doing against what your system can realistically handle. Some systems slow down when they have to draw very large polygons (slow fill rate).

You can also turn on profiling to see how long it takes to render a frame. Send a profile message to the [gemwin] object. The number that is printed is the number of milliseconds one frame takes to render; 50 milliseconds per frame corresponds to 20 frames per second. profile 2 is useful if you want to see how long the image processing is taking.

  • profile 0 - turn off profiling
  • profile 1 - turn on profiling
  • profile 2 - turn on profiling and don't cache pixes
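
For example, you could connect a message box to the [gemwin] object like this (a rough sketch of the patch; the timing is printed to the Pd console):

  [profile 1(
  |
  [gemwin]

If a frame is reported as taking 25 milliseconds, that is 1000/25 = 40 frames per second.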

Why is GEM slow on IRIX?

If you are having major slowdowns, then please let me know.

I have gotten very good performance on most machines (Indy, O2, Impact, Onyx2).

This FAQ applies to: 0.70, 0.72, 0.73, 0.74, 0.75, 0.76, 0.77, 0.78, 0.79, 0.80, 0.81, 0.82, 0.83

Why is GEM slow on WinNT/Win95/Win2k/WinXP/...?

You probably don't have hardware acceleration.

You can use software rendering, but it is basically useless except for extremely basic patches.

You can get a good graphics accelerator for really cheap these days. I recommend a card based on nVidia's chipsets, such as the TNT2 or GeForce, but there are other companies such as 3dfx and Matrox.

Make sure that you are running the latest drivers for your card. The basic drivers that come with the cards are usually very bad.

Also, PCs don't deal with lots of texture maps very well (they are bus limited, at least until AGP), so if you are trying to use lots of constantly changing texture maps (especially with [pix_multiimage], [pix_video] or [pix_film]), that will cause problems.

Why is GEM slow on Linux?

Probably because you are using Mesa, which might be rendering entirely in software.

Mesa is an awesome package by Brian Paul (brianp@avid.com) which "emulates" OpenGL.

Basically, it is a fully compliant OpenGL package, but it isn't officially sanctioned by the OpenGL ARB; as such, it doesn't carry the OpenGL name.

There are acceleration packages for many graphics cards, but I don't know anything about them.

nVidia is very supportive of Linux: their TNT2 and GeForce cards work under Linux with hardware-accelerated OpenGL (but the drivers are proprietary).

Radeon cards should also be well supported under Linux (even with open-source drivers).

If I resize the window, everything looks strange.

GEM doesn't trap resize events in IRIX or Linux (this is not a problem in WinNT).

This means that OpenGL doesn't have the correct information to render properly.

If you want to resize the window, send a dimen x y message to gemwin before you create the window.
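
For example, to get an 800x600 window (the size is just an illustration), a single message box with a comma sends the dimen message first and then creates the window:

  [dimen 800 600, create(
  |
  [gemwin]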

Can GEM run on a 3Dfx Voodoo card?

I (this is: Mark Danks) have a Voodoo2 card, which runs fine under WinNT.

I use the OpenGL beta driver from 3Dfx at work all the time and, except that the Voodoo takes over the full screen, it works without any problems.

You will need to download the OpenGL Beta driver from 3Dfx's web site at http://www.3dfx.com and put the OpenGL32.dll into the same directory as pd.exe (NOT gem.dll). Debugging patches is much easier if you have two monitors, one for the 3-D card and one for the 2-D card.

IMPORTANT: You must set the environment variable GEM_SINGLE_CONTEXT=1 to make the Voodoo card work. This will make the window 640x480 (which is the correct size for TV video out on my Canopus V2 card). On WinNT, right click "My Computer" and go to "Properties"; on the "Environment" tab, add the variable GEM_SINGLE_CONTEXT with a value of 1. Resizing the GEM window with a Voodoo card is not a great idea: the Voodoo card can only display certain window sizes and will clip the graphics.
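
If you start Pd from a command prompt instead, you can also set the variable just for that session; a sketch (the pd command line is whatever you normally use):

  set GEM_SINGLE_CONTEXT=1
  pd -lib Gem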

For the tech heads in the audience...

I create an OpenGL context at startup and never actually display its associated window.

This means that GEM objects can create display lists, call OpenGL commands, etc. in their constructors, even if no window is actually being displayed. However, with the Voodoo card, there can only be one OpenGL context. So, instead of creating one context and just holding onto it in the background, I create the normal GEM window and associate the OpenGL context with it... and the user can never destroy or close that window.

This FAQ applies to: 0.70, 0.72, 0.73, 0.74, 0.75, 0.76, 0.77, 0.78, 0.79, 0.80, 0.81, 0.82, 0.83, 0.84, 0.85, 0.86, 0.87, 0.888

Will GEM support hardware transform and lighting (T&L)?

Absolutely!

Unlike some other APIs, OpenGL will automatically use hardware accelerated transform and lighting if the card has it.

GEM gets great performance from cards like nVidia's GeForce.

I get an error "GEM needs Truecolor visual support".

This error means that your X display is running with paletted colors, which is the result of limited color depth. If you start the X display with startx -- -bpp 16 or some higher number, then it should work fine. 32-bit color is the best.
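
For example (the depth value depends on what your X server and card support; newer Xorg servers use -depth instead of -bpp):

  startx -- -bpp 24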

Why does Gem crash when creating the Gem-window?

When I try to create a Gem-window, my X-server crashes (or worse). By the way, I am using Ubuntu/hoary with fglrx drivers.

Quick fix

Try setting the environment variable GEM_SINGLE_CONTEXT to 1.

Explanation

Gem establishes an (invisible) openGL-context at startup, even if no Gem-window has been created yet. When you create the Gem-window, a second openGL-context is used (which shares some properties with the invisible context).

For some reason this does not seem to be possible with some graphics drivers (e.g. ATI's proprietary fglrx drivers) and some window managers (I suspect compiz/beryl), leading to crashes of the X-server and/or system freezes.

The current workaround is to set the environment variable GEM_SINGLE_CONTEXT=1, which prevents this dual-context magic.

How?

  • If you are starting Pd from bash, you can start Pd/Gem with GEM_SINGLE_CONTEXT=1 pd -lib Gem
  • On bash, you can also set this permanently by adding the line export GEM_SINGLE_CONTEXT=1 to either ~/.bashrc (the bash configuration file in your home directory) or to /etc/bash.bashrc (to set it for all users)

This FAQ applies to: 0.91

Why does Gem crash when sending [destroy( to [gemwin]?

After working with Gem for a while on my intel graphics card, I closed the Gem-window by sending a [destroy( message to the [gemwin] object. This crashed Pd with a "Segmentation Fault".

Hmm, even though this question is frequently asked, there is no satisfying answer to it yet. Chances are high that something will change if you set the GEM_SINGLE_CONTEXT environment variable, as described above.

This FAQ applies to: 0.91.1

Why does Gem crash when closing the Gem-window?

I am running Gem in fullscreen mode. In order to still be able to access my patch after the Gem-window has taken full control over my desktop, I use [gemmouse] to trigger a [destroy( message whenever I click the right mouse-button. While this closes the Gem-window, it also crashes Pd :-(

This happened with older versions of Gem, and should be fixed at least since 0.91. Please upgrade!

The reason for this is that [gemmouse] sends out all events immediately when they occur, which happens on the stack of a method of the Gem-window itself. Destroying the window at that moment invalidates the stack. A quick fix is to put a [delay 0] object between the trigger event and the [destroy( message, as sketched below.
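
A rough sketch of that workaround (which outlet of [gemmouse] carries the right mouse-button may differ between Gem versions, so treat the exact connection as an assumption):

  [gemmouse]
  |  (right-button outlet)
  [select 1]
  |
  [delay 0]
  |
  [destroy(
  |
  [gemwin]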

But again, this hack is unnecessary with recent versions of Gem.

This FAQ applies to: 0.888

by IOhannes m zmoelnig last modified 2007-07-18 11:49 AM
