waldi
05-31-2002, 04:03 PM
I'm having trouble getting libEQ.a to work, and this is the weird part:
I looked in the main.cpp source file for the error message I was seeing ("Disabling decoder due to showeq.conf preferences") and found that, a few lines above it, the "broken decode" flag was being forced to "1". This is odd, because the line just before it was commented out, yet that commented-out line appeared to be the one that actually reads the "broken decode" option from the preferences file.
So, based on this, it looks like there's no way for it to actually go to the libEQ.a file for anything, because it always thinks the broken-decode option is set.
Does this seem right?
I changed the "= 1" to "= 0", rebuilt, and re-ran the program, and it stopped barking at me about a broken decoder. However, I still wasn't getting any spawn information.
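For anyone following along, here is a minimal sketch of the pattern I'm describing; the struct and function names below are placeholders, not the actual ShowEQ identifiers:

#include <cstdio>

struct Params { int broken_decode; };

int main() {
    Params params;

    // The line that would read the option from showeq.conf is commented
    // out (the lookup call below is hypothetical, not the real ShowEQ API):
    // params.broken_decode = readBrokenDecodePref();
    params.broken_decode = 1;   // forced on; changing this to 0 silences the warning

    if (params.broken_decode)
        std::printf("Disabling decoder due to showeq.conf preferences\n");
    return 0;
}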
My libEQ.a md5sum is correct, and the ./configure worked just fine (checking for ProcessPacket in -lEQ... yes)