Discussion:
Audio Range and Zero-crossings, Performance, Hashes and Future Directions
(too old to reply)
Veli-Pekka Tätilä
2007-03-04 13:30:37 UTC
Permalink
Hi list,
I'm new here. I'm a twenty-something sight-impaired programmer and musician from
Finland. I've got experience with analog synths, both virtual and real, and know
how to build Reaktor patches as well as program in Java, Perl and C. Now
I've picked up the basics of ChucK and read through the PDF manual that
comes with the program. Looks good and certainly a great deal more
accessible than recent versions of Reaktor if you're using magnification and
a screen reader program. I've got a flurry of questions I'll ask here.

ChucK says it is:
chuck version: 1.2.0.7b (dracula)
exe target: microsoft win32

1. What's the range of the audio float datatype and what would be the best
way to detect zero crossings? My aim is to write a simple app that counts
the samples in the low and high phases of a pulse wave and prints out the
pulse width whenever it changes. If I can also get MIDI input into the app,
it would be quite easy to determine how the pulse width in percentages changes
as a function of the pulse width knob on my virtual and real analogs,
whose exact values are not documented.

Here's some prototypical code (this is my first real ChucK script):

Code:
100::ms => dur blockSize; // Processing resolution.
until(adc.last() < 0) // Measure low-phase first.
blockSize => now;
// Sample counters:
0 => int positive;
0 => int negative;
while(true)
{
adc.last() => float sample;
if(sample > 0)
++positive;
else if(sample < 0)
{
++negative;
if(positive > 0) // Measured at least one cycle.
<<< "Width: ", 100 * negative / (negative + positive) >>>;
0 => positive => negative; // Reset counters.
} // else if
// Ignore the pure 0 value.
blockSize => now;
} // while
End code.

However, when I run this, the app doesn't ever seem to get past the until
loop. I'm assuming here that samples are floats or doubles from -1 to 1 as
in VST, as I didn't find the range in the manual. Is this correct? If not,
it's no wonder the code won't work, <smile>. Of course the rather grainy
test processing rate, ten times a sec, affects matters greatly, but I don't
ever seem to get negative sample values even with arbitrary audio. The adc and
dac modules do work: if I patch them together and record from the wave input,
I get a delayed copy of the input in the output. I guess this is the audio
equivalent of "cat".

2. Why is it that ChucK locks up when I try to process things at the rate of
1::samp? This is what I'd use in the working code, but when I try that, CPU
usage peaks and I have a hard time killing ChucK. Might ChucK be polling some
system-wide keyboard hooks or something? I'm using a screen reader which I
run together with ChucK. I do know it has blocking hooks deep in the OS, both
in keyboard handling and graphics, i.e. GDI or DirectDraw. The OS is XP Pro
SP2 English and this laptop is an HP NX8220. The reader and magnifier is
Dolphin Supernova 7.03 beta 7.

3. Is there any way to decrease the latency on Windows platforms for true
realtime MIDI playback and audio processing, similar to Reaktor? DirectSound
latency is pretty bad; my audio machine, which isn't this laptop, has both
WDM kernel streaming and ASIO support.

4. How does one use the ZeroX module for detecting zero crossings? What's
the value range, and what is the output method of that module called, i.e.
which thing should I poll to detect the crossings?

The manual says:

Quote:
Emits a single pulse at the the zero crossing in the direction of the zero
crossing
(see examples/zerox.ck)
End quote.

That path is slightly outdated, by the way. I found the example in:

.\examples\basic\zeroX.ck

The example uses the module patched into the DAC directly, so I never see the
values it produces in processing.

5. Continuing with modules, I've discovered that there are the low-level
programming constructs, a few basic ugens, and then fancy entities that would
be instruments in Reaktor speak and remind me of Korg's physical modelling.
Are additional basic modules going to be added in the future? Ideally I'd
like, in addition to implementing the constructions as program code, to use
ChucK as a more conventional modular synthesizer. Reaktor has got very
generic modules in it: stuff like digital gates and sequential circuits,
multiplexing and demultiplexing, comparators, waveshaping, variable delays,
lookup tables, step sequencers and so on. You get the picture. While it is
true that many of these things can be simulated easily, and even more
flexibly, in program code, that doesn't feel like programming a modular
synth, if you know what I mean. But then again, that might not be the aim of
ChucK. Maybe I could use ChucK itself to model some of those, if that's
possible, by inheriting from a ugen that is. Is there example code for
something like that, whose primary focus is not the audio processing itself
as it often is in DSP? I'm not actually much good at higher math.

6. How do you use the array datatype in ChucK as a hash? I read that you can
index the array with strings much like a Perl hash, except that numbers and
strings are totally separate. If I get handed a hashtable whose keys I don't
know beforehand, is there a keys or each function for iterating through the
hash's keyset? Is the hash portion of the array unordered, as one might
expect of an implementation using e.g. an array of pointers to linked lists?

Thanks for any help in advance.
And sorry about these complaints; I just thought ChucK would have been a
more complete piece of software than it currently is. Well, it has developed
loads since I last read the manual and decided back then that it wasn't
quite ready yet. Now it is complete enough that I could imagine
using it as a Reaktor and VST plug substitute for some things. Another bonus
is that it is available for Linux, so one of my friends who's into Linux and
music could use it, too. I've already hyped the prog. I'm much more
comfortable handling memory and doing processing in ChucK or Java than I am
in C++, for instance.
--
With kind regards Veli-Pekka Tätilä (***@mail.student.oulu.fi)
Accessibility, game music, synthesizers and programming:
http://www.student.oulu.fi/~vtatila/
joerg piringer
2007-03-04 13:41:36 UTC
Permalink
Post by Veli-Pekka Tätilä
Hi list,
I'm new here. <snip>
However, when I run this, the app doesn't ever seem to get past the until
loop. I'm assuming here that samples are floats or doubles from -1 to 1 as
in VST, as I didn't find the range in the manual. Is this correct? <snip>
The adc and dac modules do work.
i didn't look too closely at your code but you seem to have no
statement like:
adc => blackhole;
or
adc => dac;
at the beginning. so in fact adc isn't working at all because it's not
in the ugen chain.
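A minimal sketch of the idea (using the global adc and blackhole ugens; whether adc.last() itself behaves may depend on the ChucK version, as discussed later in the thread):

Code:
adc => blackhole; // pull the adc into the ugen graph so it produces samples
100::ms => dur blockSize;
until(adc.last() < 0)
blockSize => now;
<<< "saw a negative sample:", adc.last() >>>;
End code.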

best
joerg
--
http://joerg.piringer.net
http://www.transacoustic-research.com
http://www.iftaf.org
http://www.vegetableorchestra.org/
Veli-Pekka Tätilä
2007-03-04 13:55:37 UTC
Permalink
Hi,
And thanks for an extremely quick reply. I'll snip myself here.
Post by joerg piringer
Post by Veli-Pekka Tätilä
<snip>
i didn't look too closely at your code but you seem to have no
statement like:
adc => blackhole;
or
adc => dac;
at the beginning. so in fact adc isn't working at all because it's not
in the ugen chain.
Ah I see, only chains leading to the DAC are processed, to save CPU cycles.
Somewhat counter-intuitive initially, in this case. I wish there were a way
to print diagnostic warnings about disabled modules. Most GUI-enabled
modular environments show it in some way.

Seems my query is still valid, though. I added:
adc => dac;
at the top of the script, and now I get an echo effect because there's plenty
of latency between the input and output. It still doesn't get past the until
loop; probably some other common newbie goof.
--
With kind regards Veli-Pekka Tätilä (***@mail.student.oulu.fi)
Accessibility, game music, synthesizers and programming:
http://www.student.oulu.fi/~vtatila/
joerg piringer
2007-03-04 14:15:20 UTC
Permalink
Post by Veli-Pekka Tätilä
Hi,
And thanks for an extremely quick reply. I'll snip myself here.
<snip>
Ah I see, only chains leading to the DAC are processed, to save CPU cycles.
Somewhat counter-intuitive initially, in this case. I wish there were a way
to print diagnostic warnings about disabled modules. Most GUI-enabled
modular environments show it in some way.
I added:
adc => dac;
at the top of the script, and now I get an echo effect because there's plenty
of latency between the input and output. It still doesn't get past the until
loop; probably some other common newbie goof.
you can always use:
adc => blackhole;
it routes the adc into nothing but still constructs a chain.

but maybe that's not the problem.

you can try to substitute your until-loop with the following:
until(adc.last() < 0) // Measure low-phase first.
{
blockSize => now;
<<< adc.last() >>>;
}

so it should print out the value of the adc. that could give you a hint...
in my case it was always zero... but that's because of my soundcard
settings on the laptop.

best
joerg
--
http://joerg.piringer.net
http://www.transacoustic-research.com
http://www.iftaf.org
http://www.vegetableorchestra.org/
Spencer Salazar
2007-03-04 15:01:31 UTC
Permalink
Post by Veli-Pekka Tätilä
Ah I see, only chains leading to the DAC are processed, to save CPU cycles.
<snip>
adc => dac;
At the top of the script, and now I get an echo effect because there's plenty
of latency between the input and output. It still doesn't get past the until
loop, probably some other common newbie goof.
I believe this is a current bug with adc (definitely not a newbie
goof). My solution is something like this, although there may be
better ways:

adc => Gain g => blackhole;

until( g.last() < 0 )
blockSize => now;

Let us know if that doesn't work!

spencer
Veli-Pekka Tätilä
2007-03-04 17:20:03 UTC
Permalink
Post by Spencer Salazar
Post by Veli-Pekka Tätilä
Ah I see, only chains leading to DAC are processed to save CPU
cycles. Somewhat counter-intuitive initially, in this case. <snip>
adc => dac;
At the top of the script, and now I get an echo effect because there's plenty
of latency between the input and output. It still doesn't get past the until
loop, probably some other common newbie goof.
I believe this is a current bug with adc (definitely not a newbie
goof). My solution is something like this, although there may be
better ways:
adc => Gain g => blackhole;
until( g.last() < 0 )
blockSize => now;
Let us know if that doesn't work!
Ah, it does, thanks for the correction. I think stuff like this should end up
in the manual, as reading from the ADC is a common need anyway.

Well, here's an improved version which seems to measure PWM OK:

adc => Gain g => blackhole;
1::samp => dur blockSize; // Processing resolution.
until(g.last() < 0) // Measure low-phase first.
blockSize => now;
0 => int negative => int positive;
50.0 => float width => float lastWidth;
while(true)
{
g.last() => float sample;
if(sample >= 0)
++positive;
else
{
if(positive > 0)
{ // Measured at least one cycle.
100 * negative / (negative + positive) => width;
if(width != lastWidth)
{ // Only print new widths.
<<< width, " %", negative, positive >>>;
width => lastWidth;
} // if
0 => positive => negative;
} // if
++negative; // Also the 1st sample of the next cycle.
} // else
blockSize => now;
} // while

I've also solved the mystery of the machine locking up when I run in a tight
loop of 1::samp. When the script gets noise as input, as it does if I don't
play anything, it tries to detect the pulse width of the noise too, which may
print out new values on almost every sample. That means thousands of print
calls a sec. Without the screen reader the machine doesn't really halt. But
with it, the poor app tries to magnify and speak each and every print, though
there's a bit of optimization, and thus leaves very little CPU time to other
processes, including ChucK. Quite logical, actually.

Is there a noise gate module? I should patch such a thing after the ADC to
avoid the above situation. There's also probably some minor logic error, or
else the waveforms I get from my synths don't cleanly match the sampling rate.
That is, I'm testing the script with the pulse waves from my Roland JP-8080
and Waldorf Pulse. In both cases I get minor fluctuations in the output
that happen pretty often: sometimes the count of negative or
positive samples varies by one, even though I play one long static note and
don't touch the width at all. Changing the pitch doesn't seem to eliminate
the problem either, though of course the sample counts are different. I
wonder what's wrong; probably some classic off-by-one bug I've missed.

Could the above code be re-written using the ZeroX module? I still am a bit
unsure as to how it works exactly.
--
With kind regards Veli-Pekka Tätilä (***@mail.student.oulu.fi)
Accessibility, game music, synthesizers and programming:
http://www.student.oulu.fi/~vtatila/
Spencer Salazar
2007-03-04 19:16:45 UTC
Permalink
Hi Veli-Pekka,
Welcome to ChucK!
Post by Veli-Pekka Tätilä
1. What's the range of the audio float datatype and what would be the best
way to detect zero crossings?
[...]
I'm assuming here that samples are floats or doubles from -1 to 1 as
in VST, as I didn't find the range in the manual. Is this correct?
Yes. Although, the -1 to 1 part isn't enforced within a UGen graph,
so a system that is dealing with completely arbitrary input probably
shouldn't have that expectation.
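As an aside, a quick sketch of that freedom (standard ugens; the gain values are chosen purely for illustration): intermediate samples can swing well outside -1 to 1 inside the graph, as long as they are scaled back before reaching the dac.

Code:
SinOsc s => Gain boost => Gain trim => dac;
440.0 => s.freq;
10.0 => boost.gain; // samples swing roughly -10..10 inside the graph
0.05 => trim.gain;  // scaled back to about -0.5..0.5 at the dac
1::second => now;
End code.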
Post by Veli-Pekka Tätilä
3. Is there any way to decrease the latency on WIndows platforms for true
realtime MIDI playback and audio processing similar to Reaktor? DirectSound
latency is pretty bad as my audio machine, which isn't this laptop, would
have both WDM kernel streaming and ASIO support in it.
You could try decreasing the audio buffer size through the
--bufsize### command line option (see http://chuck.cs.princeton.edu/doc/
program/vm.html for more information). However, please note that
smaller buffer sizes may cause audio to break up, depending on your
soundcard, audio driver, and OS.
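For example, to request a smaller buffer when launching a script (hypothetical file name; the size is appended directly to the flag):

Code:
chuck --bufsize256 foo.ck
End code.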
Post by Veli-Pekka Tätilä
4. How does one use the ZeroX module for detecting zero crossings? What's
the value range and what is the output method of that module called i.e.
which thing should I poll to detect the crossings?
Emits a single pulse at the the zero crossing in the direction of the zero
crossing
Yes--this just means that for a zero crossing from > 0 to < 0 in its
input, it outputs a -1 for that particular sample, and for < 0 to > 0,
it outputs a 1.
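A sketch of how one might poll it (a hypothetical patch, reading through a trailing ugen in the spirit of the Gain => blackhole workaround earlier in the thread):

Code:
adc => ZeroX z => blackhole;
while(true)
{
1::samp => now;
if(z.last() > 0)
<<< "upward zero crossing" >>>;
else if(z.last() < 0)
<<< "downward zero crossing" >>>;
// 0 means no crossing on this sample
}
End code.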
Post by Veli-Pekka Tätilä
6. How do you use the array datatype in ChucK as a hash? I read that you can
index the array with strings much like a Perl hash, except that numbers and
strings are totally separate. If I get handed a hashtable whose keys I don't
know beforehand, is there a keys or each function for iterating through the
hash's keyset? Is the hash portion of the array unordered as one might
expect in an implementation e.g. an array of pointers to linked-lists?
The hash table form of arrays is somewhat limited in available
operations at this time--it's only possible to do assignment and
lookup, with keys that are known in advance. Clearly we'd like to
have some more advanced functionality built into hash tables in the
near future.
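In other words, something like this already works (a sketch assuming the string-keyed side of a ChucK array; the keys have to be spelled out in the code):

Code:
float gain[0];        // empty array; its string-keyed part is separate
0.5 => gain["bass"];  // assignment by known key
0.9 => gain["lead"];
<<< gain["bass"], gain["lead"] >>>; // lookup by known key
End code.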

spencer
Veli-Pekka Tätilä
2007-03-04 19:44:27 UTC
Permalink
Post by Spencer Salazar
Post by Veli-Pekka Tätilä
1. What's the range of the audio float datatype and what would be the best
way to detect zero crossings?
[...]
I'm assuming here that samples are floats or doubles from -1 to 1 as
in VST, <snip>
Yes. Although, the -1 to 1 part isn't enforced within a UGen graph,
Ah I see, so the rest of the value range is available when needed. But I
imagine that if larger values end up in the DAC they will be hard-clipped or
wrapped.
Post by Spencer Salazar
Post by Veli-Pekka Tätilä
3. Is there any way to decrease the latency on Windows platforms
for true realtime MIDI playback and audio processing similar to Reaktor?
You could try decreasing the audio buffer size through the
--bufsize### command line option (see http://chuck.cs.princeton.edu/doc/
program/vm.html for more information).
Will do that. However, I just read that ChucK is based on SDL and DirectSound,
meaning a 20 ms latency is quite an achievement on many cards. It would be
great if the Windows port of SDL had native ASIO support some day. That
would get rid of the latency issues with SDL apps in general.
Post by Spencer Salazar
Post by Veli-Pekka Tätilä
4. How does one use the ZeroX module for detecting zero crossings?
Emits a single pulse at the the zero crossing in the direction of the zero
crossing
Yes--this just means that for a zero crossing from > 0 to < 0 in its
input, it outputs a -1 for that particular sample, and for < 0 to > 0,
it outputs a 1.
But what does it output when no zero crossing is detected at the now time,
which is more than likely? I suppose that would be 0 and I could test, of
course, but I'll be lazy and ask here.
Post by Spencer Salazar
Post by Veli-Pekka Tätilä
6. How do you use the array datatype in ChucK as a hash?
you can index the array with strings much like a Perl hash, except that
numbers and strings are totally separate. If I get handed a hashtable
whose
keys I don't know beforehand, is there a keys or each function for
iterating
through the hash's keyset?
The hash table form of arrays is somewhat limited in available
operations at this time--it's only possible to do assignment and
lookup, with keys that are known in advance. Clearly we'd like to
have some more advanced functionality built into hash tables in the
near future.
Ah I see; in addition to keys and values, the ability to reverse the keys and
values of a hash is sometimes very useful, though it only properly works if
the values of the hash also happen to be unique. Still, the current
implementation can be used for lightweight struct-ish datatypes, I'd imagine,
similarly to how one would use hashes in Perl when constructing classes.

I also read that the OOP stuff is a bit on the way. The lack of public and
protected data doesn't bother me much, but having constructor support would
rock. As well as the ability to declare abstract or interface classes,
similar to interfaces in Java or pure virtual classes in C++.

Which reminds me of yet another query: I read somewhere that as ChucK also
supports multimedia, it would have OpenGL graphics support, too. Can I
access it in ChucK itself and if so, where's the reference? The manual
doesn't say anything about OpenGL.
--
With kind regards Veli-Pekka Tätilä (***@mail.student.oulu.fi)
Accessibility, game music, synthesizers and programming:
http://www.student.oulu.fi/~vtatila/
Spencer Salazar
2007-03-05 15:47:57 UTC
Permalink
Howdy,
Post by Veli-Pekka Tätilä
Post by Spencer Salazar
Post by Veli-Pekka Tätilä
1. What's the range of the audio float datatype and what would be the best
way to detect zero crossings?
[...]
I'm assuming here that samples are floats or doubles from -1 to 1 as
in VST, <snip>
Yes. Although, the -1 to 1 part isn't enforced within a UGen graph,
Ah I see, so the rest of the value range is available when needed. But I
imagine that if larger values end up in the DAC they will be hard clipped or
wrapped.
This would be the expected behavior, yes, although empirically the
exact upper/lower bounds before clipping can vary across different
platforms or equipment.
Post by Veli-Pekka Tätilä
Post by Spencer Salazar
Post by Veli-Pekka Tätilä
3. Is there any way to decrease the latency on Windows platforms
for true realtime MIDI playback and audio processing similar to Reaktor?
You could try decreasing the audio buffer size through the
--bufsize### command line option (see http://chuck.cs.princeton.edu/doc/
program/vm.html for more information).
Will do that. However, I just read that ChucK is based on SDL and DirectSound,
meaning a 20 ms latency is quite an achievement on many cards. It would be
great if the Windows port of SDL had native ASIO support some day. That
would get rid of the latency issues with SDL apps in general.
ChucK does use DirectSound on Windows, which indeed makes attaining
realtime latency problematic. ChucK isn't based on SDL, but rather
on RTAudio, which does have ASIO support, though ChucK doesn't
support that backend currently. So, ASIO is slowly making its way
into ChucK, but we also don't have an ASIO platform to test on,
making the process more difficult.
Post by Veli-Pekka Tätilä
Post by Spencer Salazar
Post by Veli-Pekka Tätilä
4. How does one use the ZeroX module for detecting zero crossings?
Emits a single pulse at the the zero crossing in the direction of the zero
crossing
Yes--this just means that for a zero crossing from > 0 to < 0 in its
input, it outputs a -1 for that particular sample, and for < 0 to > 0,
it outputs a 1.
But what does it output when no zero crossing is detected at the now time,
which is more than likely? I suppose that would be 0 and I could test, of
course, but I'll be lazy and ask here.
Ah. Yes, just 0.
Post by Veli-Pekka Tätilä
I also read that the OOP stuff is a bit on the way. The lack of public and
protected data doesn't bother me much, but having constructor support would
rock. As well as the ability to declare abstract or interface classes,
similar to interfaces in Java or pure virtual classes in C++.
Yes--real constructors are definitely on the todo list, and ChucK
actually reserves the "interface" and "implements" keywords for
potential future use.
Post by Veli-Pekka Tätilä
Which reminds me of yet another query: I read somewhere that as ChucK also
supports multimedia, it would have OpenGL graphics support, too. Can I
access it in ChucK itself and if so, where's the reference? The manual
doesn't say anything about OpenGL.
OpenGL support (GLucK) is currently in development limbo, but it's
something we'd very much like to make available in the near future.

spencer
joerg piringer
2007-03-05 18:11:52 UTC
Permalink
Post by Spencer Salazar
Post by Veli-Pekka Tätilä
3. Is there any way to decrease the latency on Windows platforms
for true realtime MIDI playback and audio processing similar to Reaktor?
<snip>
ChucK does use DirectSound on Windows, which indeed makes attaining
realtime latency problematic. ChucK isn't based on SDL, but rather
on RTAudio, which does have ASIO support, though ChucK doesn't
support that backend currently. So, ASIO is slowly making its way
into ChucK, but we also don't have an ASIO platform to test on,
making the process more difficult.
you could use asio4all
http://www.asio4all.com/
it's supposed to work with all soundcards
Post by Spencer Salazar
Post by Veli-Pekka Tätilä
Which reminds me of yet another query: I read somewhere that as ChucK also
supports multimedia, it would have OpenGL graphics support, too. Can I
access it in ChucK itself and if so, where's the reference? The manual
doesn't say anything about OpenGL.
OpenGL support (GLucK) is currently in development limbo, but its
something we'd very much like to make available in the near future.
how about the possibility of writing extensions, so we could write our
own? i know there's some support in the code but i didn't figure out how
to use it. any hints?

best
joerg
--
http://joerg.piringer.net
http://www.transacoustic-research.com
http://www.iftaf.org
http://www.vegetableorchestra.org/
Veli-Pekka Tätilä
2007-03-05 18:48:02 UTC
Permalink
Post by Spencer Salazar
Post by Veli-Pekka Tätilä
3. Is there any way to decrease the latency on Windows platforms
for true realtime MIDI playback and audio processing similar to Reaktor?
I just read that ChucK is based on SDL and DirectSound, meaning a 20 ms
latency is quite an achievement on many cards. It would be great if the
Windows port of SDL had native ASIO support.
ChucK does use DirectSound on Windows, which indeed makes attaining
realtime latency problematic.
Yeah, for all the hassle and worries about legacy sound APIs like OSS and
hardware mixing, this is about the first time I'm envious of Linux users for
their sound support, <smile>. That is to say, ALSA does give low latency if
you have the drivers, although incidentally I've heard the TerraTec drivers
for Linux have some extra latency issues. Oh well.

Quoting joerg here:
you could use asio4all
http://www.asio4all.com/
it's supposed to work with all soundcards
End quote.

Yeah, that works fine for SoundMAX cards such as the one in this laptop. But
the problem is not the lack of low-latency ASIO but DirectSound itself. Does
anyone know how apps like Sonar can achieve low latency using MS
technology? I've heard the names WDM and kernel streaming, but neither says
much to me as I'm not a driver developer.
Post by Spencer Salazar
Post by Veli-Pekka Tätilä
having constructor support would rock.
As well as the ability to declare abstract or interface classes
Yes--real constructors are definitely on the todo list, and ChucK
actually reserves the "interface" and "implements" keywords for
potential future use.
Ah, nice. Speaking of future extensions, does this snippet in the ChucK
manual mean I can actually already load code on the fly, much like using a
DLL or a shared library? That is:

Quote:
ChucK Standard Libraries API these libraries are provide by default with
ChucK - new ones can
also be imported with ChucK dynamic linking (soon to be documented...).
End quote.

I wonder when this will be, in fact, documented.
Post by Spencer Salazar
OpenGL support (GLucK) is currently in development limbo, but its
something we'd very much like to make available in the near future.
I see, which reminds me of another thing entirely. Will ChucK some day have
a sufficiently powerful, cross-platform toolkit for effortlessly building
synth panels as in Reaktor? If it would also support presets, MIDI learn,
respect user color and font preferences, and be keyboard and screen reader
accessible, I'd be chucked into virtual analog heaven, <grin>.

Here are some thoughts as to what might need to be implemented:
Coarse value adjustments: knobs and sliders with a keyboard interface
modelled on sliders.
Exact value adjustments: text entry boxes and spinners.
Toggle buttons: much like small push buttons.
Radio buttons: AKA switches in synth speak; again the keyboard interface
could be lifted from a radio group.
Output: analog meters and read-only edit boxes, AKA read-only displays.
Not to mention groups for putting controls into. This is about the set of
controls early Reaktor had, and it was quite sufficient for a long time. If
the native controls could be used, I wonder whether wxWidgets would be
sufficiently portable and powerful.
Kassen
2007-03-05 20:43:26 UTC
Permalink
Veli-Pekka;


I see, which reminds me of another thing entirely. Will ChucK have some day
Post by Veli-Pekka Tätilä
in the future a sufficiently powerful, cross-platform toolkit for
effortllessly building synth panels as in Reaktor?
I think the mini-audicle user-interface element thingies that are there on
Mac right now and will be here for Win and Linux sometime soon-ish would
come quite close to something like that? I'm not 100% certain about what
Reaktor does and doesn't do there, but I imagine that the more advanced and
configurable it gets, the less "effortless" it would be...



If it would also support
Post by Veli-Pekka Tätilä
presets,
As soon as we get real file in-out you'll be able to make your own preset
system!


MIDi learn,


You could already implement your own MIDI learn function for ChucK if you'd
like to...
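Indeed, a bare-bones MIDI learn fits in a few lines: wait for the first control change, remember its controller number, then map that controller to a parameter from then on. This is only a sketch, assuming MIDI device 0 is available; the oscillator and the gain mapping are arbitrary choices of mine:

```chuck
// minimal MIDI-learn sketch: bind the first CC we see to oscillator gain
// (assumes MIDI input device 0 exists)
MidiIn min;
MidiMsg msg;
if( !min.open( 0 ) ) me.exit();

SinOsc s => dac;
.2 => s.gain;

-1 => int learnedCC;
while( true )
{
    min => now;                      // wait for incoming MIDI
    while( min.recv( msg ) )
    {
        // only look at control-change messages (0xB0) on any channel
        if( ( msg.data1 & 0xF0 ) != 0xB0 ) continue;
        if( learnedCC < 0 )
        {
            msg.data2 => learnedCC;  // first CC seen becomes "the" knob
            <<< "learned CC:", learnedCC >>>;
        }
        else if( msg.data2 == learnedCC )
            msg.data3 / 127.0 => s.gain;  // scale 0..127 to 0..1
    }
}
```

Telling the user *which* parameter is being learned is, as noted, the hard part without on-screen controls.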
Post by Veli-Pekka Tätilä
respect user color and font preferences
I don't think that would be one of the harder things to get but it might
break some cross-platform compatibility if people start referring to fonts
that might belong to a specific OS in their ChucK files.


and would be
Post by Veli-Pekka Tätilä
keyboard and screen reader accessible, I'd be chucKed into virtual analog
heaven then, <grin>.
Screen readers are aids for people with bad or no eyesight, right? That
might be an interesting concept. It could be argued/studied that programming
languages as a musical interface might get around some of the limitations
for visually impaired people caused by the recent developments in graphical
interfaces. Might be an interesting point for Ge's research? Personally I'm
very skeptical about the benefits of all those fancy graphics in commercial
synth programs for people with perfect sight as well; they seem to distract
from the sound.

Keyboard reading we have already, by the way, I'd say you are getting quite
close to your heaven!

Yours,
Kas.
Veli-Pekka Tätilä
2007-03-06 11:14:48 UTC
Permalink
Kassen wrote:
[UI for synth panels]
Post by Kassen
Post by Veli-Pekka Tätilä
in the future a sufficiently powerful, cross-platform toolkit for
effortllessly building synth panels as in Reaktor?
I think the mini-audicle user-interface element thingies that are there on
Mac right now and will be here for Win and Linux sometime soon-ish would
come quite close to something like that?
Cool, I've only seen references to that Mini Audicle thing and tried out
some binary in Windows. It looks to me like a ChucK IDE, though I've yet to
figure out how to run scripts in it, <embarrassed emoticon>.
Post by Kassen
Reactor does and doesn't do there but I imagine that the more advanced and
configurable it gets the less "effortless" it would be...
Yes, if these user interfaces are built programmatically, it would be quite
nice if the app could auto-layout panel controls for you. Alternatively,
I've grown pretty fond of the way Gnome apps are able to build complex
layouts by nesting panels that lay out their elements horizontally or
vertically.
Post by Kassen
Post by Veli-Pekka Tätilä
If it would also support presets,
As soon as we get real file in-out you'll be able to make your own preset
system!
That's right. Although I was thinking the panel objects could maybe export
their serialized state. That info could then be stored in a file directly.
Do ChucK objects have built-in serialization capabilities as in Java? Or
maybe a class to inherit from and a bunch of callbacks for preset
management, kind of like in VST plugs.
Post by Kassen
MIDI learn,
You could already implement your own MIDI learn function for ChucK if you'd
like to...
Yes, although I'd have to ask the user which parameter should be learned, if
there are no on-screen controls to interact with. Is there any good way to
do string input in ChucK? There's only simple unformatted debug printing in
the examples.
Post by Kassen
Post by Veli-Pekka Tätilä
respect user color and font preferences
I don't think that would be one of the harder things to get but it might
break some cross-platform compatibility
Not necessarily. I didn't mean the GUI would be customizable in the app
itself. Instead, it would use the colors and fonts you had picked for your
OS. I'm a low-vision user myself, so I've chosen Windows colors that are
sufficiently high contrast, but virtually no VST plug ever respects those,
for example.

[accessibility]
Post by Kassen
Post by Veli-Pekka Tätilä
keyboard and screen reader accessible, I'd be chucKed into virtual analog
heaven then, <grin>.
Screen readers are aids for people with bad or no eyesight, right?
Basically yes. My rather technical definition is that they are apps that
programmatically reduce the graphical user interface to text, which is then
rendered as speech and/or Braille. Other things they often do include
following focus changes, heuristically guessing which pieces of text are
labels, and augmenting the keyboard interface by providing mouse emulation
and virtual focus for apps that either lack a keyboard interface or have a
broken tab order.
Post by Kassen
languages as a musical interface might get around some of the limitations
for visually impaired people caused by the recent developments in graphical
interfaces.
It's not so much the interfaces as such, but the fact that they are custom
controls screen readers are not able to read programmatically. They also
lack keyboard access, unlike, say, your average Office app. I'm actually a fan
of GUIs myself and wouldn't want to go back to a command prompt, though I
like programming. It's just GUIs that don't cleanly reduce to text that are
the problem: usually direct manipulation with the mouse, graphs and such. A
graphically editable envelope would be a good example, though again there's
no reason why the points could not be managed via a list view and some
buttons.

Here's a quick experiment, if you'd like to know what using a basic screen
reader is like:
Hit Windows+R for the run box and type Narrator into it. If you are on OS X,
launch VoiceOver, and if it's Linux, run Gnome and type Orca into the run
box. Then turn off your monitor, unplug the mouse and start using the
computer relying on keyboard and speech.
Post by Kassen
synth programs for people with perfect sight as well; it seems to distract
from the sound.
Not to mention all the people with high-res TFT displays, glasses and/or
aging eyesight.

[HID API]
Post by Kassen
Keyboard reading we have already
Yup, I've noticed. Although I think I found a bug in it. In:
.\chuck-1.2.0.7-exe\examples\hid\keyboard-organ.ck
the keystrokes get delivered to ChucK even when that window has not got
the focus. I find that counter-intuitive and a potential problem if you
multi-task several keyboard instruments in different consoles. Sure, it is
expected behavior for MIDI but not for the keyboard in Windows apps, at
least. Besides, such an implementation likely means system-wide Windows
keyboard hooks may need to be used, which can be a resource hog at times. My
screen reader already has its hooks in the chain.

How good is the HID API, by the way? I've got a gamepad with a mini joystick,
a v axis, a throttle, a POV switch and 6 buttons, and I have always thought it
would be cool if I got it to send out MIDI. Also, this laptop has a Synaptics
touchpad that can send pressure in addition to the x-y coords. So I guess one
could use it as a 3D mini Kaoss pad, if you will.
--
With kind regards Veli-Pekka Tätilä (***@mail.student.oulu.fi)
Accessibility, game music, synthesizers and programming:
http://www.student.oulu.fi/~vtatila/
Kassen
2007-03-06 19:38:15 UTC
Permalink
Hi, Veli-Pekka! Wonderfully interesting points!
Post by Veli-Pekka Tätilä
[UI for synth panels]
Cool, I've only seen references to that Mini Audicle thing and tried out
some binary in WIndows. Itr looks to me like a ChucK IDE, though I've yet to
figure out how to run scripts in it, <embarrassed emoticon>.
Ah, yes. I'm on Win and Linux as well, but my eyes still work (though they
are going down...) so I saw the screenshot of the Mac version. This
screenshot shows sliders and LEDs and so on, and the idea is of course that
this will be extended. Making those screen-readable now seems like a good
idea, especially since I imagine that in the long run more visually impaired
people will try programming for their electronic music needs; the idea seems
very natural.


That's right. Although I was thinking the panel objects could maybe export
Post by Veli-Pekka Tätilä
their serialized state. That info could be then stored in a file directly.
Do chucK objects have built-in serialization capabilities as in Java? Or
maybe a class to inherit from and a bunch of callbacks for preset
management, kind of like in VSt plugs.
Right, yes. I think that here Spencer's point of view is close to my own. As
I understood it, Spencer wants to integrate such things with ChucK itself,
not make them into an exception. I think that this would make it more
versatile and powerful in the long run. For novice users it might be harder
to use at first, but that trade-off might be worth it. I think we can assume
that anyone who is using it will already be familiar with implementing
functionality using ChucK. The same goes for MIDI learn. You are quite right
that MIDI learn needs a way to indicate which parameter is meant, and right
now -without a graphical interface- that would be a rather primitive affair.
As soon as the graphics get here it would need a night of clever thinking,
but it would be very possible.

This might be a good topic for a future example to be distributed with the
Mini.
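Until real file I/O arrives, presets can at least be snapshotted in memory from within ChucK itself. A hypothetical sketch; the Preset class and the store/recall helpers are my own names, not any ChucK API:

```chuck
// hypothetical in-memory preset sketch; real persistence awaits file I/O
class Preset
{
    float freq;
    float gain;
}

SinOsc s => dac;

// copy the current "panel" state into a preset object
fun void store( Preset p )
{
    s.freq() => p.freq;
    s.gain() => p.gain;
}

// push a preset object's state back onto the synth
fun void recall( Preset p )
{
    p.freq => s.freq;
    p.gain => s.gain;
}

Preset a;
440.0 => s.freq;
.5 => s.gain;
store( a );        // snapshot current state
220.0 => s.freq;   // tweak...
recall( a );       // ...and jump back to the snapshot
```

With file I/O, store/recall would simply serialize those member values to disk instead.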



Not necessarily. I didn't mean the GUI would be customizable in the app
Post by Veli-Pekka Tätilä
itself. In stead, it would use the colors and fonts you had picked for your
OS. I'm a low-vision user myself so I've chosen Windows colors that are
sufficiently high contrast but virtually no VST plug ever respects those,
for example.
Ah! I think that I myself run one of the Windows settings meant for disabled
people; I use one of the high-contrast B&W ones because I find them easier
on the eye and more clear.


[accessibility]
Post by Veli-Pekka Tätilä
Basically yes. My rather technical definition is that they are apps that
programmatically reduce the graphical user interface to text, which is then
rendered as speech and or Braille. Other things they often do include
following focus changes, heuristically guessing which pieces of text are
labels and augmenting the keyboard interface by providing mouse-emulation
and virtual focus for apps that either lack a keyboard interface or have a
broken tab-order.
Got it. This is the sort of thing that explains why we need alt tags for
images, why Flash is of dubious value, and so on.



It's not so much the interfaces as such, but the fact that they are custom
Post by Veli-Pekka Tätilä
controls screne readers are not able to read programmatically. They also
lack keybord access unlike say your average Office app. I'm actually a fan
of GUIs myself and wouldn't want to go back to a command prompt, though I
like programming. It's just GUis that don't cleanly reduce to text that are
the problem. Usually direct manipulation with the mouse, graphs and such. A
graphically editable envelope would be a good example, though again there's
no reason why the points could not be managed via a list view and some
buttons.
Absolutely. Now that I think of it, it's quite remarkable that nobody -that I
know of- has pointed out the interesting benefits that livecoding and
generally using programming as a musical instrument might have for the
visually impaired.
Post by Veli-Pekka Tätilä
Post by Kassen
synth programs for people with perfect sight as well; it seems to
distract
Post by Kassen
from the sound.
Not to mention all the people with high res TFT displays, glasses and or
aging.
Oh, yes, and the need for split-second decisions while in chaotic
environments like when performing in a nightclub.



[HID API]
Post by Veli-Pekka Tätilä
.\chuck-1.2.0.7-exe\examples\hid\keyboard-organ.ck
The keystrokes get delivered to chucK even when that window has not gotten
the focus. I find that counter-intuittive and a potential problem if you
multi-task several keyboard instruments in different consoles. Sure it is
expected behavior for MIDI but not for the keyboard in Windows apps at
least. Besides such an implementation likely means system-wide Windows
keyboard hooks may need to be used, which can be a resource hog at times. My
screen reader already has its hooks in the chain.
Personally I like this. Right now I work with ChucK and the Nord Modular, and
I find it very convenient to be able to play ChucK with the keyboard while
the Nord editor is in focus and addressed by the mouse. If this isn't what
you want, you might want to try the non-HID keyboard interface; that one
doesn't do this and demands focus. You would lose key-up and key-down
messages, though.



How good is the HID API by the way? I've got a gamepad with a mini joystick,
Post by Veli-Pekka Tätilä
a v axis, the throttle, pov switch and 6 buttons and hav always thought it
would be cool if I got it to send out MIDI. Also this laptop has a Synaptics
touchpad that can send pressure in addition to the xy coords. So I guess one
could use it as a 3D mini Khaos pad if you will.
I have some good news for you. As I mentioned, my eyes are fine for normal
computer usage, but I got tired of staring at the screen all the time in a
performance context, so I'm developing my own instrument (a house-style
sequencer) which is built around two HID gaming devices and is playable
without using the screen at all. I still use screen prints to know what BPM
I'm currently at and how much shuffle I'm using, but only to help me make
"presets". This is very feasible and I'd like to encourage you to try to do
the same. A good pianist can play blindfolded, so why not a techno producer?
<provocative grin>.

One thing that I've become very interested in is interface sonification for
musical programs. I'm in the very early stages of testing that idea. My aim
here is to stylise the interface sonification in such a way that it blends
with the music, using some quantisation where it may be necessary. You might
want to experiment in that direction as well. As ASIO gets here this will
feel more natural; realtime interface sonification depends heavily on low
latency if you want to make it pleasing. I found that I tend to input
commands rhythmically after listening to a beat for a while, regardless of
whether those commands have anything to do with the beat directly.


Yours,
Kas.
Spencer Salazar
2007-03-06 22:56:09 UTC
Permalink
Post by Veli-Pekka Tätilä
[UI for synth panels]
Post by Kassen
Post by Veli-Pekka Tätilä
in the future a sufficiently powerful, cross-platform toolkit for
effortllessly building synth panels as in Reaktor?
I think the mini-audicle user-interface element thingies that are there on
Mac right now and will be here for Win and Linux sometime soon-ish would
come quite close to something like that?
Cool, I've only seen references to that Mini Audicle thing and
tried out
some binary in WIndows. Itr looks to me like a ChucK IDE, though I've yet to
figure out how to run scripts in it, <embarrassed emoticon>.
It's easy! (hopefully...) You just have to type or paste some ChucK
code into the editor window, start the virtual machine (Alt + .), and
then add the shred with Alt + . Once the virtual machine is active,
you can also use the buttons at the top of the document window to
add, remove, and replace shreds.

We hope to soon improve the usefulness of miniAudicle by writing some
decent documentation.
Post by Veli-Pekka Tätilä
Post by Kassen
Reactor does and doesn't do there but I imagine that the more
advanced and
configurable it gets the less "effortless" it would be...
Yes, if these user interfaces are built programmatically, it would be quite
nice if the app could auto-layout panel controls for you.
Alternatively,
I've grown pretty fond of the way how Gnome apps are able to build complex
layouts by nesting panels that layout their elements horizontally or
vertically.
This sort of auto-layout functionality is planned, but for now manual
layout is/will be required.
Post by Veli-Pekka Tätilä
[HID API]
Post by Kassen
Keyboard reading we have already
.\chuck-1.2.0.7-exe\examples\hid\keyboard-organ.ck
The keystrokes get delivered to chucK even when that window has not gotten
the focus. I find that counter-intuittive and a potential problem if you
multi-task several keyboard instruments in different consoles. Sure it is
expected behavior for MIDI but not for the keyboard in Windows apps at
least. Besides such an implementation likely means system-wide Windows
keyboard hooks may need to be used, which can be a resource hog at times. My
screen reader already has its hooks in the chain.
This is actually a feature, not a bug. The idea is to abstract the
keyboard as just some device with buttons, rather than a producer of
sequential character input. Fortunately this approach doesn't
require much in the way of system-wide hooks, as it simply uses
DirectInput.

As Kassen mentioned, there is the similar KBHit class available,
which does require the ChucK window to have the focus and may be
better suited to your preferred usage. There are a few other
differences in semantics, such as key repeating when a key is held.
See examples/event/kb.ck for an example.
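The KBHit loop in that example looks roughly like this (a sketch from memory, so consult kb.ck itself for the authoritative version):

```chuck
// focus-dependent keyboard input via KBHit (cf. examples/event/kb.ck);
// unlike the HID keyboard, this only fires while the ChucK window has focus
KBHit kb;

while( true )
{
    kb => now;                        // wait for a keystroke
    while( kb.more() )                // drain all pending characters
        <<< "ascii:", kb.getchar() >>>;
}
```

Note the key-repeat semantics Spencer mentions: holding a key keeps generating characters here, whereas the HID interface reports discrete down/up events.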
Post by Veli-Pekka Tätilä
How good is the HID API by the way? I've got a gamepad with a mini joystick,
a v axis, the throttle, pov switch and 6 buttons and hav always thought it
would be cool if I got it to send out MIDI. Also this laptop has a Synaptics
touchpad that can send pressure in addition to the xy coords. So I guess one
could use it as a 3D mini Khaos pad if you will.
As Kassen said, it's definitely possible to use gamepads (see the hid
examples), but ChucK probably won't be able to read the pressure of
your touchpad at this point.
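Joystick-to-MIDI is then a matter of gluing the two APIs together. A sketch, assuming joystick 0 and MIDI output 0 exist; the class and field names follow the hid examples of this ChucK version, and the choice of CC 1 (mod wheel) is arbitrary:

```chuck
// sketch: map joystick axis 0 to MIDI CC 1 (mod wheel)
// assumes joystick device 0 and MIDI output device 0 are present
HidIn hi;
HidMsg msg;
if( !hi.openJoystick( 0 ) ) me.exit();

MidiOut mout;
MidiMsg mmsg;
if( !mout.open( 0 ) ) me.exit();

while( true )
{
    hi => now;                         // wait for HID activity
    while( hi.recv( msg ) )
    {
        if( msg.isAxisMotion() && msg.which == 0 )
        {
            0xB0 => mmsg.data1;        // control change, channel 1
            1 => mmsg.data2;           // controller number: mod wheel
            // axisPosition is roughly -1..1; scale to 0..127
            ( ( msg.axisPosition + 1.0 ) * 63.5 ) $ int => mmsg.data3;
            mout.send( mmsg );
        }
    }
}
```

The other axes, the POV switch and the buttons would just be further cases on msg inside the same loop.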

thanks for your comments!
spencer
Veli-Pekka Tätilä
2007-03-07 13:18:48 UTC
Permalink
Hi Kas and Spencer,
As I wanted to comment on both of your replies at once, I'm switching quoting
style in mid-thread. If this new one is problematic, let me know in a reply
and I'll change back. At least it makes it clear with screen readers who
is saying what without having to parse "greater than x 3" in my head,
<smile>. Feel free to snip heavily.

V, S, K = Veli-Pekka, Spencer, Kas

[UI for synth panels]
[mini-audicle]
V: I've yet to figure out how to run scripts in it, <embarrassed emoticon>.
S: It's easy! (hopefully...) You just have to type or paste some ChucK
code into the editor window, start the virtual machine (Alt + .), and then
add the shred with Alt + . Once the virtual machine is active,
V: Yes, that works, thanks. Which reminds me of a problem I found with the
organ; more about it in the HID section. The step I failed to do here was to
start the machine: I thought one could add a shred to it, which would
implicitly start it, or something like that. I actually tried Audicle before
the manual, so that explains things, too.

S: you can also use the buttons at the top of the document window to add,
remove, and replace shreds.
V: Ah, now that you mentioned them, I used virtual focus and found them. Here
the problem is that they are toolbar buttons, and in Windows toolbars are not
in the tab order.
g***@columbia.edu
2007-03-07 16:33:02 UTC
Permalink
Hey chuCK-sters --

One of the things I would argue for in terms of future UI
development of ChUcK is to really chase after the idea of it
becoming an 'embeddable' library. People have different UI
environments that are more or less congenial to work within, and
the "let's bring it all into our OWN world!" approach just reeks of all that
high-modernist totalizing that I hoped we were smacking down into a
happy multicultural world. Software imitates life! Art! All
that!

One of the first things I hoped could happen would be a way to
dynamically change parameters in an executing cHuCK script. I
would love to have a ugen that would poll (or be notified by)
an 'external' environment that a new value was ready for
incorporation into the chUcK VM, and then that value would be
changed.

For example, when I did the max/msp [chuck~] object, I would love to
have had a way to set up a max/msp UI object to send values into a
cHUck script like this:

Maxconnection thing => feedback.gain => something.else;

and have the executing script pick up any incoming values from the
max/msp UI object dynamically.

brad
http://music.columbia.edu/~brad
Kassen
2007-03-07 21:02:28 UTC
Permalink
Post by g***@columbia.edu
One of the first things I hoped could happen would be a way to
dynamically change parameters in an executing cHuCK script. I
would love to have a ugen that would poll (or be notified) by the
an 'external' environment that a new value was ready for
incorporation into the chUcK VM, and then that value would be
changed.
I'm not sure I understand how this would be different from OSC or MIDI.

It sounds to me like you could use OSC and get that IXIsoftware
Python-OSC-graphical_thingy and have an open, general-purpose interface of
your own to control ChucK, Max and a wireless robot-toaster in exactly the
way that you propose, but I might (likely?) be missing something?
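For reference, the receiving side of such an OSC setup is already short in ChucK; a sketch, where the port number and address pattern are arbitrary choices of mine:

```chuck
// sketch: receive a float over OSC and apply it to a parameter
// port 6449 and the address "/synth/freq" are arbitrary choices here
OscRecv recv;
6449 => recv.port;
recv.listen();

SinOsc s => dac;

// register interest in messages matching the pattern, typed as one float
recv.event( "/synth/freq, f" ) @=> OscEvent oe;

while( true )
{
    oe => now;                  // wait for an OSC message
    while( oe.nextMsg() != 0 )
        oe.getFloat() => s.freq;
}
```

Any external UI (Max, a Python GUI, another ChucK VM) that can send OSC could then drive this parameter.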


Kas.
g***@columbia.edu
2007-03-07 21:10:16 UTC
Permalink
Post by Kassen
I'm not sure I understand how this would be different from OSC or MIDI.
It's kind of my whole point, in that it is a lot different (at least
for me). This is from a draft article for Cycling 74 I'm just
finishing:

--------------
I'd like to make two final comments. The first is a plea to
developers,
and the second is a goad to Max/MSP users. For developers: think
shared libraries and loadable bundles! The Soundflower routing for
SC3 above works, but I would much prefer to access SC3
functionality directly from
within Max/MSP. Or (for that matter) I wouldn't mind being able
to access all of Max/MSP within SC3. The basic idea is that I like
the tight-coupling that can be gained through the imbedding of one
application directly inside another. I've used network approaches
in other instances in the past, and although they work well for most
situations, the ability to integrate control very closely can
lead to much better standalone interfaces and applications.
I would love to see the day when bringing in an external
application is a simple matter of bundling it properly, finding
the right entry-points, and then using the application completely
inside an alternative development environment.
--------------

OSC drives me crazy, by the way.

brad
http://music.columbia.edu/~brad
Kassen
2007-03-07 21:17:14 UTC
Permalink
Post by g***@columbia.edu
It's kind of my whole point, in that it is a lot different (at least
for me).
I see now, didn't quite get all of this from your original mail.

That sounds like something in the direction of SC3's client-server model
with open communications between both. I imagine that if ChucK and Max would
also start doing things like that, we would end up with an abstraction layer
not unlike the X server on *nix, where you can pick your own interface and
still deal with the same things; do I understand that correctly?

That would be quite a thing!

Kas.
g***@columbia.edu
2007-03-07 21:33:13 UTC
Permalink
Post by Kassen
That sounds like something in the direction of SC3's
client-server model
with open communications between both. I imagine that if ChucK
and MAX would
also start doing things like that we would end up with a
abstraction layer
not unlike Xserver on *nix where you can pick your own interface and still
deal with the same things, do I understand that correctly?
Not sure about that -- been awhile since I've done much X stuff.

It's more from a developer standpoint. I really like the idea of
being able to do something like this, for example:

int main()
{
ChucK *chucker = new ChucK();

chucker->doStuff();

...

chucker->changeThings();

...

chucker->takeOverZeeVorld();

}

and just compile in the chuck lib within whatever app you are
developing. No messing with net setup, etc.

brad
http://music.columbia.edu/~brad
Kassen
2007-03-07 21:47:54 UTC
Permalink
Post by g***@columbia.edu
and just compile-in the chuck lib. Within whatever app your are
developing. No messing with net setup, etc.
Now I get it. And this whole thing could then be made to be a part of our
own program and people would make a ChucKvst and firechuck (firefox plugin)
and it would get embedded in freeware computer games (shmuck!) and we would
all be very happy?
g***@columbia.edu
2007-03-07 22:06:56 UTC
Permalink
and we would all be very happy?
oh so very very happy! :-)

And the world would be a better place...

brad
http://music.columbia.edu/~brad
Scott Wheeler
2007-03-08 11:21:02 UTC
Permalink
Post by Kassen
Now I get it. And this whole thing could then be made to be a part of
our own program and people would make a ChucKvst and firechuck
(firefox plugin) and it would get embedded in freeware computer games
(shmuck!) and we would all be very happy?
One thing for the authors to consider, if things start moving in this
direction, is whether they'd prefer to keep the GPL license or move to
something like the LGPL, which permits a couple of the examples above. At the
moment, doing a VST adapter or using ChucK in non-OSS freeware is not
permitted by the license. Honestly, this is probably also problematic
for chuck~.

As for integration in general, yes, it is something that's important not
just from an application developer's perspective, but from a musician's.
I'm currently using a mix of Live, ChucK and VST plugins over a series
of virtual audio and MIDI cables, but that makes restoring my
environment for a given piece / performance rather complicated. I've
thought about writing a dummy VST plugin that just sets up the right
connections and calls the ChucK executable with a given ChucK file.

-Scott
isjtar
2007-03-08 11:34:45 UTC
Permalink
On 8-Mar-07, at 12:21 h
... using ChucK in non-OSS freeware are not
permitted by the license. Honestly, this is probably also problematic
for chuck~.
are you sure about that? General consensus in the Max world seems to
be that the GPL for externals is not a problem, because Max is an
interpreter and doesn't have to be free.
This complies with what Stallman says in preferring the GPL over the LGPL.
best

isjtar
isjtar
2007-03-08 11:37:02 UTC
Permalink
and the same for maxpatches btw
Post by isjtar
On 8-Mar-07, at 12:21 h
... using ChucK in non-OSS freeware are not
permitted by the license. Honestly, this is probably also
problematic
for chuck~.
are you sure about that? general consensus in the maxworld seems to
be that the gpl for externals is not a problem because max is an
interpretor and doesn't not have to be free.
this complies with what stallman says in prefering the gpl over lgpl.
best
isjtar
_______________________________________________
chuck-users mailing list
https://lists.cs.princeton.edu/mailman/listinfo/chuck-users
Scott Wheeler
2007-03-08 11:53:10 UTC
Permalink
Post by isjtar
On 8-Mar-07, at 12:21 h
are you sure about that? general consensus in the maxworld seems to
be that the gpl for externals is not a problem because max is an
interpretor and doesn't not have to be free.
No, I'm not sure. I've never looked at the license for the Cycling '74
SDK and I don't know if there's anything that has to be linked to or a
rather small interface that just has to be implemented. But the
interpreter point is irrelevant since the externals aren't written in
Max. Max patches being GPL'ed is something rather different.

I don't expect Ge & Friends are going to sic hordes of lawyers on Brad
anytime soon, but I just wanted to raise the issue as something for the
ChucK team to clarify at some point. And there are some things that I've
toyed around with that wouldn't be possible with the current licensing.

Note: I'm not trying to discourage usage of the GPL, but there are
advantages and disadvantages to each of the different flavors of OSS
licenses.

-Scott (who decided to start dual licensing his stuff GPL/MPL :-) )
isjtar
2007-03-08 12:15:17 UTC
Permalink
Post by Scott Wheeler
But the
interpreter point is irrelevant since the externals aren't written in
Max.
yes, you're right, only thought about that afterwards.
g***@columbia.edu
2007-03-08 13:47:31 UTC
Permalink
Post by Scott Wheeler
I don't expect Ge & Friends are going to sick hordes of lawyers on Brad
anytime soon,
I have photos. I have videos. They wouldn't dare.

:-)

brad
http://music.columbia.edu/~brad
g***@columbia.edu
2007-03-08 13:46:20 UTC
Permalink
Post by Scott Wheeler
I've
thought about writing a dummy VST plugin that just sets up the
right
connections and calls the ChucK executable with a given ChucK
file.
I've made some pluggo plugins w/ ChUck that work well using
[chuck~]. Fun stuff!

I don't know much about the max/msp policy, but I think Cycling
imagines it as a development environment and standalones/plugins
are not covered by any Cycling rights-claims. There is a weirdness
with Windows, if anyone is interested I recall some discussion in
the archives.

Also, apologies for not updating [chuck~] recently. As soon as the
new version of cHUck gets set I'll get a new one out. Life
intervenes...

brad
http://music.columbia.edu/~brad
Scott Wheeler
2007-03-08 14:07:54 UTC
Permalink
Post by g***@columbia.edu
I don't know much about the max/msp policy, but I think Cycling
imagines it as a development environment and standalones/plugins
are not covered by any Cycling rights-claims. There is a weirdness
with Windows, if anyone is interested I recall some discussion in
the archives.
I just grabbed the SDK and took a peek. It is definitely GPL
incompatible as it requires linking to a non-OSS library. This isn't so
much about the Max/MSP SDK license as the ChucK one.

See:

http://www.gnu.org/licenses/gpl-faq.html#MereAggregation

As such, chuck~ violates ChucK's license, not Cycling '74's. Again, I
doubt the ChucK authors are deeply offended, but if their intent is to allow
such things, it'd be nice if that were made explicit. There are
a number of ways to do that (the GPL provides a mechanism for
exceptions; there are also more permissive licenses, the popular
variants being, in order of permissiveness, BSD, MPL and LGPL).

-Scott
Steffen
2007-03-08 20:43:46 UTC
Permalink
Post by Scott Wheeler
I just grabbed the SDK and took a peek. It is definitely GPL
incompatible as it requires linking to a non-OSS library. This isn't so
much about the Max/MSP SDK license as the ChucK one.
http://www.gnu.org/licenses/gpl-faq.html#MereAggregation
As such, chuck~ violates ChucK's license, not Cycling '74's.
Then the consequence is that no Max/MSP external can be GPL, since
they all link to the GPL-incompatible Max/MSP. Eh?
Scott Wheeler
2007-03-09 00:02:33 UTC
Permalink
Post by Steffen
Then the consequence is, that no Max/MSP external can be GPL since
they all link to GPL incompatible Max/MSP. Eh?
Yes, basically. There is a distinction that if an external is created
specifically for the Max/MSP SDK and does not use code from other GPL'ed
projects, then there is an (arguable) implicit exception. But you can't
(legally) take code from a third party GPL'ed project and use that
either directly or as a library in a Max/MSP external. This naturally
happens, but mostly because people aren't actually reading the license
that they're using.

-Scott
martin!
2007-03-09 14:52:12 UTC
Permalink
Post by Steffen
Then the consequence is that no Max/MSP external can be GPL, since
they all link to the GPL-incompatible Max/MSP. Eh?
I'm fairly certain this is the case. Other projects have encountered
this problem. Because of similar licensing issues, VST support in
Audacity must be packaged separately and under a different license.

See: http://audacityteam.org/vst/

martin robinson
isjtar
2007-03-08 15:46:45 UTC
Permalink
yes, the Windows version has some 3rd-party libraries or something
like that, which makes it illegal to sell (and maybe even to
distribute, but i'm unsure) standalone versions or pretty much
anything involving Max itself.
Post by g***@columbia.edu
Post by Scott Wheeler
I've
thought about writing a dummy VST plugin that just sets up the right
connections and calls the ChucK executable with a given ChucK
file.
I've made some pluggo plugins w/ ChucK that work well using
[chuck~]. Fun stuff!
I don't know much about the max/msp policy, but I think Cycling
imagines it as a development environment and standalones/plugins
are not covered by any Cycling rights-claims. There is a weirdness
with Windows, if anyone is interested I recall some discussion in
the archives.
Also, apologies for not updating [chuck~] recently. As soon as the
new version of ChucK gets settled I'll get a new one out. Life
intervenes...
brad
http://music.columbia.edu/~brad
_______________________________________________
chuck-users mailing list
https://lists.cs.princeton.edu/mailman/listinfo/chuck-users
Spencer Salazar
2007-03-07 17:57:26 UTC
Permalink
V: A funny thing about the organ. I've tried playing very fast in
Windows and it seems to me as though fast input is auto-quantized to
some pretty coarse note value. I suppose this is a limitation of the
keyboard scanning, right?
The quantization in that example is actually enforced by that
specific ChucK program. If you check out the code, you might notice
an 80::ms => now; that is executed right after each key press.
Removing that line will remove the quantization--then it should be
able to keep up with faster input, unquantized.
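As a rough sketch, the pattern looks like this (this is the generic
MidiIn idiom from the manual, not the organ example's actual code,
which differs in detail):

```chuck
// generic MIDI-driven loop, sketched from the manual's MidiIn idiom
MidiIn min;
MidiMsg msg;

// open the first MIDI device; bail out if that fails
if( !min.open( 0 ) ) me.exit();

while( true )
{
    // wait until a MIDI event arrives
    min => now;
    while( min.recv( msg ) )
    {
        // ...trigger the note here...
    }
    // this line is the quantizer: every key press is followed
    // by an 80 ms wait; remove it for unquantized input
    80::ms => now;
}
```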
[MIDI]
I mentioned MIDI processing as one potential use and find the current
API a bit raw. It would be great if I could transparently handle
either off-line MIDI data in a file or stuff that's coming in on the
wire in real time. Furthermore, if it is off-line, it would be very
cool if the data could be idealized a little: note on/off pairs to
note events with length, a bunch of controller messages to 14-bit
NRPN and RPN messages, and full sysex messages whose checksum would be
auto-calculated based on the manufacturer ID, you get the picture. I
think the current API is about as raw as in VST, which means getting
the job done but reinventing wheels in the process. Ideally, I'd like
to work at the musician level without having to know about running
status, handle channels by masking off nibbles in binary data, or
recall or define constants for the various event types. A Java-like
API with dedicated event classes for each one would be great. Even a
more primitive presentation as in MIDI-Perl using arrays would be
quite all right.
Yes, some expansion/abstraction of ChucK's MIDI implementation has
been discussed before, and would be pretty beneficial.

spencer
Kassen
2007-03-07 20:16:40 UTC
Permalink
Post by Veli-Pekka Tätilä
Hi Kas and Spencer,
As I wanted to comment on both of your replies at once, I'm switching
quoting style in mid-thread. If this new one is problematic, let me
know in a reply and I'll change back. At least it makes it clear with
screen readers who is saying what without having to parse "greater
than x 3" in my head, <smile>. Feel free to snip heavily.
Very good!
Post by Veli-Pekka Tätilä
V: Ah I see, sounds good to me. A classic accessibility question: does
that panel UI have the concept of keyboard focus, tab order and the
ability to interact with the controls using the same hotkeys for knobs
that Mac and Windows use for sliders? These are the most frequently
overlooked points in synth UI design, in terms of accessibility, and
make life hard if your primary input medium is keyboard and output
comes via synthetic speech.
I agree. A while ago I started covering my desk and USB ports with gaming
devices and so I tend to use the keyboard for those things a lot more since
there is no more space for a separate mouse :-)
Post by Veli-Pekka Tätilä
V: That's right, and also because sighted folks might like precise
keyboard control from time to time. Precise adjustment of values, and
the ability to type in values using numbers or modal dialogs, would
rock. I've seen a zillion synth UIs in which I'd like to type in an
exact cutoff value and cannot, though I have 102 keys at my disposal.
Indeed. Envelopes in Ableton Live come to mind.
Post by Veli-Pekka Tätilä
V: It's funny you mention this. I didn't think of live performance
when I looked into ChucK, although I imagine it could be of great
value in that, too. As I input virtually everything via MIDI in my
music, and record it live in a seq, the live aspect of ChucK wasn't my
cup of tea, though.
Well, livecoding aside ChucK can work as a live instrument exactly like you
want it to work. It takes MIDI, HID and OSC so the sky is the limit.



Post by Veli-Pekka Tätilä
What I'm looking for is a more accessible equivalent to Reaktor
eventually, whose UI has been going down and down for years in terms
of accessibility. Namely the ability to patch modular synths together
and process MIDI events either in real time or off-line in MIDI files
like you would in Cakewalk Application Language. Another interest is
as a simple testbench for audio and MIDI mangling ideas, comparable to
VST but much easier and faster to work with.
I think it will suit this role very well. I found development in ChucK
to be very fast. MIDI file import and export aren't here yet, but that
seems like a useful and obvious future inclusion.



Post by Veli-Pekka Tätilä
I've noticed that graphical programming, as in Reaktor, has a limit
after which it would be faster and more natural to express those ideas
in code, particularly if you have to use magnification and speech to
begin with. Laying out math formulae as a series of modules is one
obvious example. And another is counting stuff. To count a 16-bit
integer quantity in Reaktor, I would have to worry about the range of
events and chain two binary counters.
0 => int myCounter; // And that's that.
++myCounter; // to increment, and modulo to wrap.
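In full, such a 16-bit-style counter with wraparound is only a couple
of lines (an untested sketch):

```chuck
// a 16-bit-style counter with modulo wraparound, as described above
0 => int myCounter;

while( true )
{
    // increment and wrap at 65536, i.e. 16 bits
    ( myCounter + 1 ) % 65536 => myCounter;
    // advance time so the shred yields to the VM
    100::ms => now;
}
```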
Yes, indeed. I have no experience with Reaktor but I've been with the
Nord Modular since the beginning, and I personally found that being
able to be very precise about the order in which things are computed
is a huge benefit. That's not so easy to express in graphical systems.
In programs like MAX it's implied in the layout, but I'd rather use
the layout to express something about the structure of the program,
which might not be the same thing intuitively all the time.

In Tassman I have at times run out of space on the screen in complicated
patches.

[interface sonification]
Post by Veli-Pekka Tätilä
V: I wonder if that's been studied. As part of my uni graduation work
investigating issues of current screen readers from the user point of
view, which is just in its infancy, I've read plenty of stuff on
accessibility. I do know auditory icons have been researched.
Yes, the most obvious is things like Windows making a click sound when
you click your mouse. However, all serious DAW users turn that off for
obvious reasons. I have never heard of interface sonification for
musical effect in music programs beyond the metronome. Still, I'm
finding the idea intriguing. Right now I have a prototype that works
like this. In my sequencer you can select a step that you might want
to add a beat or note to (or remove one from, etc). I made it so that
if the selected step comes up in the loop, a click is heard in the
main mix. Since that already depends on the sequencer itself it's
inherently quantised, and I picked a click that suits the other
sounds. This is a big help in programming beats while not having any
visual feedback, and it blends with the rest quite naturally; in fact
I found myself "playing" those clicks already. I'm now thinking about
ways to extend this and elaborate on it.

I'm open to ideas here, or links to research. Beyond organ players
insisting on key clicks I'm drawing blanks with regard to interface
sonification for music programs. For video games it's a different
matter; games like Rez and Lumines use this sort of thing all over the
place to great effect, but neither of those would be so much fun if
you can't see well. Space Channel 5 (by the same designer as the other
two) should be playable with bad vision. The Csound book has an
interesting article on sonifying error messages in a way that
indicates the sort of error or event that occurred; that's interesting
too.
Post by Veli-Pekka Tätilä
V: Quite nice, actually. But where do you store the sequences and in
what kind of a format? The Japanese music macro language (MML) would
be especially nice for expressing modulation and monophonic parts in
ASCIIbetical notes. I mean, I've used it to compose mini tunes for the
8-bit Nintendo on the PC. I still lack an official spec, though I've
written a simple MML parser for myself in Perl.
Oh, they are just arrays. When loading the program these are empty;
you have to play it to get something, and it's a matter of recording
or enjoying it while it's there, because after you shut it down the
music is gone forever. This is on purpose, actually; it's a conceptual
thing. I've been called crazy already <grin>. I can store and recall a
few patterns while it runs, so it's feasible to write little songs in
it.
Post by Veli-Pekka Tätilä
V: But what kind of params would you adjust via timing? Tap tempo
comes very naturally, but I don't know about the rest. Of course,
switching patterns needs to be done in time, too.
Nearly everything, actually. For example, I have an "accent" function.
Say I'm editing my leadline and I press the "accent" button that's on
the Beatmania controller (a PlayStation one). The program will then
wait for the
Veli-Pekka Tätilä
2007-03-08 21:34:02 UTC
Permalink
Kassen wrote:
[accessibility of the knobby panel UI]
Post by Kassen
does that panel UI have the concept of keyboard focus, tab order and
the ability to interact with the controls using the same hotkeys for
knobs that Mac and Windows use for sliders? These are the most
frequently overlooked points in synth UI design, in terms of
accessibility, and make life hard if your primary input medium is
keyboard and output comes via synthetic speech.
I agree. A while ago I started covering my desk and USB ports with
gaming devices and so I tend to use the keyboard for those things a
lot more since there is no more space for a separate mouse :-)
I see, and most apps let you also move the mouse or send out keyboard
messages from gamepads, if all else fails. At least the Gravis
software does. Another point is that some sighted folks don't like the
mouse either. I do know people who positively hate drag and drop.
Usability books also consider UIs that require right clicks, dragging
or chord clicking as potentially unusable for new mouse users.

[uses]
Post by Kassen
like you want it to work. It takes MIDI, HID and OSC so the sky is
the limit.
Except that I don't seem able to clone my favorite Reaktor modules in
ChucK natively just yet and have to hit C++ territory for that, arrgh,
manual memory management be damned. I've been coding recreationally
for the last five years and so have had pretty much a free choice in
terms of languages, apart from uni projects. So I've grown to like the
high-level stuff, namely Perl and Java. So C++ is not my thing at all,
if I can avoid having to use it, though Unix and C are a bit of an
exception. That's why I'm glad ChucK is what it is.
Post by Kassen
I've noticed that graphical programming, as in Reaktor, has a limit
after which it would be faster and more natural to express those ideas
in code <snip>
been with the Nord Modular since the beginning <snip> found that being
able
to be very precise about the order in which things are computed is a
huge benefit. That's not so easy to express in graphical systems.
That's right, although you don't have parens in arithmetic expressions
in modulars, as the signal flow determines precedence. Which reminds
me: how do ChucK and Nord Modular handle divide by zero? Most langs I
know throw some sort of exception or call a signal handler. However,
obviously the event divide module in Reaktor cannot simply stop
processing at any point, especially in a live situation. I think IEEE
floats have the special values not-a-number and infinity, so maybe
dividing by 0 should give infinity, at least intuitively. Does ChucK
support these special values?
Post by Kassen
In Tassman I have at times run out of space on the screen in
complicated patches.
At that point it is time to modularize. Kind of like the old
programming rule of having a method or function limit of about an A4:
if it's longer than that, that's a bad smell. Of course, there are
exceptional situations, e.g. where avoiding the overhead of a function
call is a major performance advantage.

[seqs]
First, about auditory interfaces.
Post by Kassen
Yes, most obvious is things like Windows making a click sound when
you click your mouse.
Totally pointless, at least to most people. But I've noticed, after
assigning different beeps for questions, warnings and notifications,
that it is far quicker for me to detect what event it is based on a
sound. I also tried samples of the screen reader saying the type very
fast, but processing that took a lot more mental CPU time. So I guess
auditory icons might have their place after all.
Post by Kassen
if the selected step comes up in the loop a click is heard in the
main mix.
This is a big help in programming beats while not having any
visual feedback
Sure, sounds interesting. Which reminds me, I've used the Electribe range of
hardware groove boxes and their lights and LCD displays are big enough for
me to read, if I peer at the device closely. So I find the visual feedback
of selected steps helpful in that context.

I also have an Alesis SR16, and in that counting steps manually, going
in and out of the step edit mode to play, is about as clumsy and
frustrating as it gets. All context is dependent on being able to
retain a consistent mental model of where you're going. Correcting a
single live-input note's timing in a pattern using such a system is,
well, awful.

One thing I'd like is the ability to shift individual steps left or
right one or more steps, while having a loop close around that point
in the pattern. When I do use a step seq, which is not often, I tend
to input half of the things quantized live, do some editing and add
the rest using the seq. In that context I often need to wait for 16
bars to move a single note around, only to find out that it needs to
be shifted one or more steps still to arrive at the desired rhythm.

You asked about ideas. I like the fact that you can select a step for
editing, and if you hit a key at a time where there's no step, the
next one will be selected. Handy indeed.

Although it's not very accessible, I like the controls in Fruity very much,
too. Rotating the pattern left or right is nice and there's a fill function
which could be described as function fillNth(int startPos, int n) in ChucK.
That's certainly cool and a real time saver even though I'm going to remove
some of the notes it adds and add others, too.
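In ChucK terms, such a fill might be sketched like this (untested, and
fillNth is my invented name, not a built-in):

```chuck
// sketch of a Fruity-style fill; fillNth is an invented name
int steps[16];  // 1 = hit, 0 = silent

fun void fillNth( int startPos, int n )
{
    // turn on every n-th step, starting from startPos
    for( startPos => int i; i < steps.cap(); n +=> i )
        1 => steps[i];
}

fillNth( 0, 4 );  // hits on steps 0, 4, 8 and 12
```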

In fact, I've been thinking of creating a Fruity-inspired drum machine
in some lang, which just might be ChucK: are there already sampler
modules and the ability to layer and add basic effects to drum sounds
without having to know DSP? ChucK could do the timing in this.

I also have the C64 SID chip on a HardSID PCI card and think it would
be cool to use that as a drum machine. So I'm thinking of creating a C
program for editing drums in it on the C64 itself. But is it possible
to call Win32 DLL functions in ChucK? The HardSID has a Windows DLL
interface for reading and writing the SID registers. In such use,
having ChucK supply the timing could be a great benefit. If DLLs are
out, what options do I have for inter-process communication within
ChucK?

My last idea concerns the ability to enter notes from the keyboard.
One of the things I find hard in step seqs, be it hardware, Fruity or
something else, is knowing which step is which, as I cannot see them
all in sufficient detail. That means I'll have to count, say to add a
hit every 5th note or something.

And then it occurred to me. The function keys in a desktop PC keyboard
already have gaps at 4-key intervals. So why not use them for randomly
accessing steps as follows:

step key
1-4 f1-f4
5-8 f5-f8
9-B f9-f12
C-F print screen, scroll lock, break, and esc
Post by Kassen
drawing blanks with regard to interface
sonification for music programs. For video games it's a different
matter; games like Rez and Lumines use this sort of thing all over
the place to great effect, but neither of those would be so much fun
if you can't see well. Space Channel 5 (by the same designer as the
other two) should be playable with bad vision.
I'm out of the loop as far as new games go, so I'll use Wikipedia to find
out more. It's totally OT here but incidentally I also have a long thread
going regarding the accessibility of computer games for partially sighted
folks:
http://www.game-accessibility.com/forum/viewtopic.php?pid=1044#p1044

[future]
Post by Kassen
I mentioned MIDI processing as one potential use and find the
current API a bit raw.
It is. There have been more comments on this and that too is on our
wish list. ChucK is a lot like Christmas for small kids; huge wish
lists.
Oh yes, and a polymorphic Santa, depending on locale, <grin>. Better
play nice here so we might actually get some presents in a future
release of Chuckmas.
--
With kind regards Veli-Pekka Tätilä (***@mail.student.oulu.fi)
Accessibility, game music, synthesizers and programming:
http://www.student.oulu.fi/~vtatila/
Kassen
2007-03-09 01:38:51 UTC
Permalink
Veli-Pekka Tätilä wrote:

Post by Veli-Pekka Tätilä
I see, and most apps let you also move the mouse or send out keyboard
messages from gamepads, if all else fails. At least the Gravis
software does. Another point is that some sighted folks don't like the
mouse either. I do know people who positively hate drag and drop.
Usability books also consider UIs that require right clicks, dragging
or chord clicking as potentially unusable for new mouse users.
As an interface for musical programs I think the mouse has some
serious issues, in that normal mouse usage takes hardly any advantage
of muscle memory at all. This closes the door on many of the things we
associate with becoming proficient (and expressive) with an
instrument. That might not be a big deal when the program is used to
manage a recording session of a band, but when it becomes the main
instrument I think it leads to questions... It might sound silly at
first, but I'm starting to feel it's a real issue that nobody talks
about playing Cubase or about virtuosity in Logic.
Post by Veli-Pekka Tätilä
[uses]
Except that I don't seem able to clone my favorite Reaktor modules in
ChucK natively just yet and have to hit C++ territory for that, arrgh,
manual memory management be damned.
That might also be a matter of getting used to ChucK. At first I often tried
the known modular-system-style approach to any problem I ran into because
that was what I was used to. In my own experience there are often other
solutions that tackle problems in a very different way and end up looking
more logical and simple.



Post by Veli-Pekka Tätilä
That's right, although you don't have parens in arithmetic expressions
in modulars, as the signal flow determines precedence. Which reminds
me: how do ChucK and Nord Modular handle divide by zero? Most langs I
know throw some sort of exception or call a signal handler. However,
obviously the event divide module in Reaktor cannot simply stop
processing at any point, especially in a live situation. I think IEEE
floats have the special values not-a-number and infinity, so maybe
dividing by 0 should give infinity, at least intuitively. Does ChucK
support these special values?
In ChucK I simply avoid divide by zero, so I'm not sure. Dividing
anything (by a variable) at all in the NM is tricky business. You need
to build your own circuit for it, and so a divide by zero is what you
make it. I don't think I ever needed a literal general-purpose divider
there, but I did once build a modulo function that worked using a sort
of feedback loop for recursion.

You can do a lot in the NM but it's not the right tool for every job, math
quickly becomes quite inefficient.
Post by Veli-Pekka Tätilä
Post by Kassen
In Tassman I have at times run out of space on the screen in
complicated patches.
At that point it is time to modularize. Kind of like the old
programming rule of having a method or function limit of about an A4:
if it's longer than that, that's a bad smell. Of course, there are
exceptional situations, e.g. where avoiding the overhead of a function
call is a major performance advantage.
Yes, true, but modularising becomes very hard in graphical systems if there
is a lot of interconnection within the system.


[seqs]
Post by Veli-Pekka Tätilä
First, about auditory interfaces.
Totally pointless, at least to most people. But I've noticed, after
assigning different beeps for questions, warnings and notifications,
that it is far quicker for me to detect what event it is based on a
sound. I also tried samples of the screen reader saying the type very
fast, but processing that took a lot more mental CPU time. So I guess
auditory icons might have their place after all.
They do, but they could be improved a lot.

Often they seem to come down to shouting "watch out!". That's better
than no warning at all, but not quite as good as shouting "car!",
"tiger!" or "jump!". In some situations it might even be worse than
nothing at all.
Post by Veli-Pekka Tätilä
Sure, sounds interesting. Which reminds me, I've used the Electribe range of
hardware groove boxes and their lights and LCD displays are big enough for
me to read, if I peer at the device closely. So I find the visual feedback
of selected steps helpful in that context.
I agree, those Electribes are a great example; at times I keep mine
synced to Ableton just for the running LEDs and having a constant
reminder of where in the loop we are. Much more clear than the little
screen icons.



Post by Veli-Pekka Tätilä
I also have an Alesis SR16, and in that counting steps manually, going
in and out of the step edit mode to play, is about as clumsy and
frustrating as it gets. All context is dependent on being able to
retain a consistent mental model of where you're going. Correcting a
single live-input note's timing in a pattern using such a system is,
well, awful.
I'm in total agreement; that one (I know the MMT8) is a nightmare. At
least it's nice for all us home-brew developers to be able to say we
have a better interface than some commercial models, with very little
effort. <smile>
Post by Veli-Pekka Tätilä
You asked about ideas. I like the fact that you can select a step for
editing, and if you hit a key at a time where there's no step, the
next one will be selected. Handy indeed.
I have something similar already but yes; that's a good idea. Dealing with
consecutive events that way seems very natural.



Post by Veli-Pekka Tätilä
Although it's not very accessible, I like the controls in Fruity very
much, too. Rotating the pattern left or right is nice, and there's a
fill function which could be described as function fillNth(int
startPos, int n) in ChucK. That's certainly cool and a real time
saver, even though I'm going to remove some of the notes it adds and
add others, too.
I had something like that in my old version that used the keyboard and I'd
like to have it back but I'm running out of buttons on my joystick <smile>.
Maybe I need another gaming device!



Post by Veli-Pekka Tätilä
In fact, I've been thinking of creating a Fruity-inspired drum machine
in some lang, which just might be ChucK: are there already sampler
modules and the ability to layer and add basic effects to drum sounds
without having to know DSP? ChucK could do the timing in this.
Yes, there is. Our sampler is called SndBuf and it's quite good. There
are effects and filters amongst the UGens, but you can also easily use
the way you address the SndBuf as an effect in itself, in the same
style as tracker users do.
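A minimal retrigger loop might look like this (a sketch; the filename
is a placeholder):

```chuck
// minimal SndBuf drum loop; "kick.wav" is a placeholder filename
SndBuf buf => dac;
"kick.wav" => buf.read;

while( true )
{
    0 => buf.pos;       // retrigger from the first sample
    1.0 => buf.rate;    // tracker-style: vary this for pitch effects
    500::ms => now;
}
```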



Post by Veli-Pekka Tätilä
I also have the C64 SID chip on a HardSID PCI card and think it would
be cool to use that as a drum machine. So I'm thinking of creating a C
program for editing drums in it on the C64 itself. But is it possible
to call Win32 DLL functions in ChucK? The HardSID has a Windows DLL
interface for reading and writing the SID registers. In such use,
having ChucK supply the timing could be a great benefit. If DLLs are
out, what options do I have for inter-process communication within
ChucK?
I would think MIDI would be the most obvious choice. Doesn't the
HardSID support MIDI?
Post by Veli-Pekka Tätilä
And then it occurred to me. The function keys in a desktop PC keyboard
already have gaps at 4-key intervals. So why not use them for randomly
accessing steps as follows:
step key
1-4 f1-f4
5-8 f5-f8
9-B f9-f12
C-F print screen, scroll lock, break, and esc
He he he. Using the esc that way has a certain charm; it places the
last step of the loop to the left of the "1", so that's quite natural,
even if it does look odd.
Post by Veli-Pekka Tätilä
I'm out of the loop as far as new games go, so I'll use Wikipedia to find
out more.
I think Space Channel 5 might theoretically be playable with no vision
at all. At that point it does get a little sad that it's known for its
design, but it's remarkable that so many music games depend so heavily
on the screen. I think you could play DDR, Guitar Hero or Beatmania
with the music muted, which is actually a bit strange for a music game
when it comes down to it.



Post by Veli-Pekka Tätilä
It's totally OT here but incidentally I also have a long thread going
regarding the accessibility of computer games for partially sighted
folks:
http://www.game-accessibility.com/forum/viewtopic.php?pid=1044#p1044
Interesting stuff! I'm going to read that and steal ideas! <grin>


[future]
Post by Veli-Pekka Tätilä
Oh yes, and a polymorphic Santa, depending on locale, <grin>. better play
nice here so we might actually get some presents in a future release of
Chuckmas.
I think Ge's hat is at least as good as Santa's!

Kas.
Veli-Pekka Tätilä
2007-03-09 13:49:02 UTC
Permalink
[usability]
Post by Kassen
As an interface for musical programs I think the mouse has some
serious issues, in that normal mouse usage takes hardly any advantage
of muscle memory at all. This closes the door on many of the things we
associate with becoming proficient (and expressive) with an
instrument.
Hey, that's something new I just had not thought about; thanks for an
enlightening idea. Maybe that's why usability folks are not too happy
about the trend of trying to look like the real thing in media
players. Their basic point is that although match with the real world
is a viable heuristic, it can be taken too far, too. Tiny player
buttons, knobs, sliders etc. are controls originating in the real
world, made with human hands and limitations in mind. Yet the mouse,
or any other pointing device for that matter, was never meant
primarily for modifying such controls. That's just not very efficient
and easy, although it might be what your average musician, who isn't a
computer power user but knows the hardware, has learned to expect.

For an example where real knobs shine, consider a knob that has a
notch in the center so you can easily zero a signed parameter. If it
is well done, the notch is prominent enough to make it easy to stop
the knob, yet is not in the way when you try to rotate the knob fast
over a large range. Do any soft synths have such a feature? Rotating
the knobs with the mouse lacks the tactile feel. Another point: if you
have to drag away from the control to manipulate it, a full-screen
magnifier goes with the pointer and you have to relocate the knob
every time to adjust it further. You also cannot follow its value or
movement. So from an accessibility point of view, that's very, very
bad. For an emulation, type magnify in the Run box, pick 7x
magnification and, after docking the magnifier at the bottom, try
using your favorite synth with the magnifier alone. The OS X magnifier
is called Zoom.

Interestingly, here is what the Windows interface guidelines say about mouse
operation:

Quote:
Providing a well-designed mouse interface is also important. Pointing
devices may be more efficient than keyboards for some users. When designing
the interface for pointing input, avoid making basic functions available
only through multiple clicking, drag and drop manipulation, and
keyboard-modified mouse actions. Such actions are best considered shortcut
techniques for more advanced users. Make basic functions available through
single click techniques.
End quote.

The Mac has similar conventions. But how many soft synths are usable
without drag and drop, and how many synth designers have an education
in human-computer interaction?
Post by Kassen
I think Space Channel 5 might theoretically be playable with no
vision at all.
Interesting. I do know the ancient PC game Quest for Fame is; I'm able
to play it at 640x480 resolution, using the rhythm view, that is. But
that's seriously OT. Well, check out:
http://en.wikipedia.org/wiki/Quest_for_Fame
Post by Kassen
Post by Veli-Pekka Tätilä
It's totally OT here but incidentally I also have a long thread
going regarding the accessibility of computer games for partially
http://www.game-accessibility.com/forum/viewtopic.php?pid=1044#p1044
Interesting stuff! I'm going to read that and steal ideas! <grin>
Please do, so we get better interfaces, though it will probably take
you n * 1::hour to digest it all. The intro post is one of my longest.

[design]
Post by Kassen
Post by Veli-Pekka Tätilä
Except that I don't seem able to clone my favorite Reaktor modules in
ChucK natively just yet and have to hit C++ territory for that, arrgh,
manual memory management be damned.
That might also be a matter of getting used to ChucK. At first I often
tried the known modular-system-style approach to any problem I ran
into because that was what I was used to. In my own experience there
are often other solutions that tackle problems in a very different way
and end up looking more logical and simple.
So, more like programming. Well, that's familiar territory for me
outside of ChucK; it's just that I've yet to adopt that mindset in
ChucK. Could you give a concrete example?

Say I'd like to create a basic multiplexer for selecting osc waves,
one out of three. Here's the first way that comes to mind:

Have a UGen reference named osc and chuck it to the dac. Then switch
on the value of another variable, using nested if / else if
constructs, to assign instances of the different oscs to that UGen
reference, selecting that particular wave for output. Is that a good
strategy?
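Roughly, I mean something like this untested sketch (the names are
mine):

```chuck
// untested sketch of a 3-way oscillator multiplexer
SinOsc s;
SqrOsc q;
TriOsc t;
UGen @ osc;  // reference to whichever wave is selected

fun void select( int which )
{
    // disconnect the previous wave, if any
    if( osc != null ) osc =< dac;
    if( which == 0 ) s @=> osc;
    else if( which == 1 ) q @=> osc;
    else t @=> osc;
    osc => dac;
}

select( 1 );        // square wave out
1::second => now;
select( 2 );        // switch to triangle
1::second => now;
```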

What if I need to use a similar multiplexing structure in ten places
in an instrument, and the number of inputs and outputs also varies?
Copy-paste is not an option; I'm unwilling to do that. Functions are
not good enough either, as they would need to retain state across
calls. So, as we don't have closures, apparently I need to create
class files. Which brings me to my unanswered questions about where to
put class files, how to import them into projects and so on.

[divide by 0 in modulars]
Post by Kassen
In ChucK I simply avoid divide by zero
Well, let's test it:

C:\audio\chuck-1.2.0.7-exe\bin>chuck.exe CON
<<< "Die: ", 1.0 / 0.0 >>>;
<<< "I'm a survivor." >>>;
^Z
Die: 1.#INF00
"I'm a survivor." : (string)

C:\audio\chuck-1.2.0.7-exe\bin>

So apparently there's the value inf. But what's the literal value I use for
comparing whether something is infinite?

[seqs]
Post by Kassen
Post by Veli-Pekka Tätilä
Electribe range of hardware groove boxes
and their lights and LCD displays are
big enough for me to read, if I peer at
the device closely. So I find the
visual feedback of selected steps
helpful in that context.
I agree, those electribes are a great example,
Although they lack the Fruity-style auto fill and rotation functions, which
I mentioned when you asked what kind of features would be cool in the ChucK
seq. But they are simple yet sufficiently powerful devices.
Post by Kassen
Post by Veli-Pekka Tätilä
editing and if you hit a key at a time where there's no step, the
next one will be selected. Handy indeed.
I have something similar already but yes; that's a good idea.
One exception: obviously, if you use a function for adding new notes, you
should be able to turn on silent steps, too.
Post by Kassen
Post by Veli-Pekka Tätilä
In fact, I've been thinking of creating a Fruity-inspired drum
machine in some lang, which just might be ChucK: are there
already sampler modules, and the ability to layer and add basic
effects to drum sounds, without having to know DSP?
Yes, there is. Our sampler is called SndBuf and it's quite good.
I'll read the manual more carefully, thanks. I hope it can mix multiple
sources or, if that's not possible, have multiple instances control playback
accurately.
Post by Kassen
easily use the way you address the SndBuf as an effect itself, in the same
style as tracker users do.
So, adding FX at the sample level; you know, that could work. The major hurdle
here would be the ability to use a common file-open dialog or some other
browser to add samples. Or a dir which is polled for wave files periodically,
updating the samples used. Fruity's ability to auto-preview samples in a
pattern is a real timesaver, and the same is true of the Forge open dialog.
Post by Kassen
Post by Veli-Pekka Tätilä
I also have the C64 SID chip on a HardSID PCI card and think it would
be cool to use that as a drum machine. So I'm thinking of creating a
C program for editing drums on it on the C64 itself. But is it possible
to call Win32 DLL functions in ChucK? The HardSID has a Windows DLL
<snip>
If DLls are out what options do I have for inter-process communication
<snip>
I would think MIDI would be the most obvious choice
It is, but it is limited in terms of timing and range of parameters. You
cannot use controllers to freely sweep the noise frequency with ease, which
is trivial in C, especially on the C64 itself.
Post by Kassen
Post by Veli-Pekka Tätilä
The function keys in a desktop keyboard already have gaps at 4-key
intervals
step key
1-4 f1-f4
5-8 f5-f8
9-B f9-f12
C-F print screen, scroll lock, break, and esc
He he he, Using the esc that way has a certain charm; it places the
last step of the loop to the left of the "1" so that's quite natural,
even if it does look odd.
I don't find it very natural myself, but at least it retains the gaps in
intuitive places. It would be worse if esc was the first step. You could
maybe also blink the num lock, scroll lock and caps status LEDs for
accents, portamento or whatever, to indicate something about the current
step. Idea stolen from MAME.

Here's another mapping for accessing large patterns quicker:
f1-f4 pick one of the four steps in the current beat, supposing 1/16 notes
f5-f8 select the beat in a measure, supposing 4/4
f9-f12,esc pick the measure in a 4-measure pattern to edit
This is much like binary numbers, in that the right-most digits, if we number
bits in a little-endian-ish way, control larger quantities.
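For what it's worth, the arithmetic behind that mapping can be sketched in ChucK (a hypothetical helper, not tied to any real key-reading code):

```
// Hypothetical helper: each group of four keys picks one digit, and the
// digits combine like a base-4 number into a 1/16-note step index.
// measure, beat and sub are each 0-3 (f1-f4 => sub, f5-f8 => beat,
// f9-f12/esc => measure).
fun int stepIndex( int measure, int beat, int sub )
{
    // 16 sixteenth notes per 4/4 measure, 4 per beat
    return measure * 16 + beat * 4 + sub;
}

<<< stepIndex( 0, 0, 0 ), stepIndex( 3, 3, 3 ) >>>; // first and last of the 64 steps
```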
--
With kind regards Veli-Pekka Tätilä (***@mail.student.oulu.fi)
Accessibility, game music, synthesizers and programming:
http://www.student.oulu.fi/~vtatila/
Kassen
2007-03-10 12:52:34 UTC
Permalink
Veli-Pekka Tätilä wrote:

Post by Veli-Pekka Tätilä
Hey, that's something new I just have not thought about; thanks for an
enlightening idea. Maybe that's why usability folks are not too happy about
the trend of trying to look like the real thing in media players.
I think so, yes. With instruments I think it's worse than with media
players. Media players tend to be switched on and left to run; maybe some
tracks will be skipped, occasionally the user might rewind an interesting bit,
and late at night the bass might be filtered out on the EQ. Instruments will
be used in a more intensive way over a longer period. Fortunately a little
more thought goes into DAW controls, or drawing out expressive modulations
would be a total nightmare instead of a chore, but I still think this is a prime
example of how ChucK can be used to make our own tools where the standard
ones are lacking (see, not so OT at all!)


Post by Veli-Pekka Tätilä
For an example where real knobs shine, consider a knob that has a notch in
the center so you can easily zero a signed parameter. If it is well done, the
notch is prominent enough to make it easy to stop the knob, yet is not in the
way when you try to rotate the knob fast over a large range. Do any soft
synths have such a feature?
Some softsynths try to get close to something similar by automatically
scaling the mouse sensitivity... That would lead to issues with the tablet I
experimented with, and I doubt your magnifier likes it. For general usage it
might help.
Post by Veli-Pekka Tätilä
Interestingly, here is what the Windows interface guidelines say about mouse
Yeah... Once again the theory is there but practice is a sad affair. The
good news is that if you'd like to experiment with the mouse, you can read
its raw data in ChucK using the mouse Hid. That will completely ignore
where on the screen it is and allow you to link movement directly to sound,
which could be interesting. Ignoring screen location might be a good thing if
you have trouble with vision anyway.
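As a rough sketch of what that could look like (device number 0 and the pitch mapping are assumptions; the Hid calls follow the mouse examples shipped with ChucK):

```
// open the mouse as a raw Hid device; motion deltas arrive
// no matter where the pointer is on screen
Hid hi;
HidMsg msg;

if( !hi.openMouse( 0 ) ) me.exit();

SinOsc s => dac;
0.2 => s.gain;
220 => s.freq;

while( true )
{
    hi => now;                    // wait for the next mouse event
    while( hi.recv( msg ) )
    {
        if( msg.isMouseMotion() )
            // map horizontal motion straight onto pitch
            s.freq() + msg.deltaX => s.freq;
    }
}
```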


Post by Veli-Pekka Tätilä
Interesting. I do know the ancient PC game Quest for Fame; I'm able to
play it using the 640x480 resolution, using the rhythm view that is.
http://en.wikipedia.org/wiki/Quest_for_Fame
Ah, that looks like Guitar Hero except without the focus on the usage of the
screen. I'd love to find that controller, looks like serious fun.


[ChucK v.s. modulars]
Post by Veli-Pekka Tätilä
So, more like programming; well, that's familiar territory for me outside of
ChucK, it's just that I've yet to adopt that mindset in ChucK. Could you give
a concrete example?
Generally with ChucK you depend more on modulations being based on time
periods, and much less on extracting info from zero crossings in control
signals. Another thing is that, due to their structure, multiple processes
running in parallel on modulars need to be re-synced often (likely every
cycle, to avoid drift and edges arriving a fraction too early or late); with
ChucK that's not necessary and timing is much more dependable. Even in
digital modulars (the Nords, Tassman, likely Reaktor and Max as well)
vertical flanks will always be quantised to the sample rate, which can and
will result in issues with timing and execution order.

That's the main thing, for me. On the downside: in ChucK, going back and
forth between audio and control signals tends to be CPU intensive. I'd
still like a way to generate events based on zero crossings without needing
to use an if-then loop based on ZeroX.last() at a rate of 1::samp, because
those loops are so expensive.


[multiplexing]
Post by Veli-Pekka Tätilä
Say I'd like to create a basic multiplexer for selecting osc waves, one out
<snip>

Yes, that's a good example.

One thing you could do is create an array and assign different oscs to the
locations in it. Even if you don't use that for addressing, you could use the
ChucK and unChucK operators to connect and disconnect the oscillators. The
advantage of that is that UGens that don't end up at the dac (indirectly) or
at blackhole won't be computed, which will save CPU.

That's of course quite different from a normal multiplexer in architecture,
but for 3 oscs it will save you about 2/3 of the CPU time, which is quite
good and not something that a Nord Modular could do. Admittedly the NM has
real waveform switching, so in this particular case it's moot.


Post by Veli-Pekka Tätilä
Have a UGen reference named osc and chuck it to the dac. Then switch on the
value of another variable, using nested if / else if constructs, to
assign instances of the different oscs to that UGen ref, selecting that
particular wave for output. Is that a good strategy?
Yes, I think so, particularly if you combine it with unchucking the
connection. An array to hold the oscillators, and a function that would take the
one osc to be connected as a parameter, could deal with it for you. The two of
those could be made into a class if you'd like to use the same thing a few
times.
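A minimal sketch of that idea, assuming three stock oscillators and the unchuck operator =< (all names are made up):

```
// one output "slot"; only the currently chucked oscillator costs CPU
SinOsc sine;
SqrOsc square;
TriOsc tri;

UGen @ current;          // reference to whichever osc is live
sine @=> current;
current => dac;

// swap the wave by unchucking the old osc and chucking the new one
fun void select( UGen next )
{
    current =< dac;
    next => dac;
    next @=> current;
}

440 => sine.freq;
440 => square.freq;
440 => tri.freq;

1::second => now;
select( square );
1::second => now;
select( tri );
1::second => now;
```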



Post by Veli-Pekka Tätilä
What if I need to use a similar multiplexing structure in ten places in an
instrument, and the number of inputs and outputs also varies?
Arrays, I'd think... That would depend on the exact needs.


Post by Veli-Pekka Tätilä
Which brings me to my
unanswered questions about where to put class files, how to import them into
projects and so on?
You can have any number of classes straight in your file. If you'd like to
import classes, you'll need to make them public, put them in a separate
.ck file (one public class per file) and Machine.add those before using
them. This is dealt with in the manual and examples. It's not 100% without
issues yet, but that would be the way to go about it.
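For example (file and class names are made up), a minimal public class in its own file might look like:

```
// file: Blip.ck -- one public class per file
public class Blip
{
    SinOsc s => dac;

    fun void play( float freq, dur length )
    {
        freq => s.freq;
        length => now;
    }
}
```

which another shred can then use, provided Blip.ck was added to the VM first (e.g. chuck Blip.ck main.ck on the command line, or Machine.add( "Blip.ck" ) from code):

```
// file: main.ck -- assumes Blip.ck is already in the VM
Blip b;
b.play( 440, 500::ms );
```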
Post by Veli-Pekka Tätilä
So apparently there's the value inf. But what's the literal value I use for
comparing whether something is infinite?
I have no idea, never ran into that. Ge or Spencer will know.
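One workaround, untested on 1.2.0.7 so treat it as a sketch: since 1.0 / 0.0 already evaluates to inf, you can derive the value instead of writing a literal:

```
// compare against a derived infinity rather than a literal
fun int isInfinite( float x )
{
    1.0 / 0.0 => float inf;   // same value the earlier test printed as 1.#INF00
    return ( x == inf || x == -inf );
}

<<< isInfinite( 3.0 / 0.0 ), isInfinite( 42.0 ) >>>;
```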

[sequencer design]
Post by Veli-Pekka Tätilä
Although they lack the Fruity-style auto fill and rotation functions, which
I mentioned when you asked what kind of features would be cool in the ChucK
seq. But they are simple yet sufficiently powerful devices.
Well, and cheap and cheerful. They have some very good interface ideas, but
the channel bleeding on the outputs of mine is atrocious and the MIDI
implementation is braindead.
Post by Veli-Pekka Tätilä
One exception: obviously, if you use a function for adding new notes, you
should be able to turn on silent steps, too.
Yes, that's an area with big questions. I'm trying to make note input (with
regard to timing) uniform with beats, but notes have so many other properties
as well that that scheme leads to exceptions in the interface. Ironing those
out in a sensible way is a good way of losing some sleep <grin>.
Post by Veli-Pekka Tätilä
I'll read the manual more carefully, thanks. I hope it can mix multiple
sources or, if that's not possible, have multiple instances control playback
accurately.
You can have both, and indeed the manual and the examples will help there. I
warmly recommend editing, recombining and remixing the example files to
aspiring ChucKists.
Post by Veli-Pekka Tätilä
So, adding FX at the sample level; you know, that could work.
Yes, in fact some of the basic otf examples use sample retriggering for drum
rolls. This is quite easy to do.
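A sketch of that kind of retriggering, using one of ChucK's built-in "special:" samples so no file path is needed:

```
// drum-roll style retriggering: rewind the playhead instead of
// instantiating a new SndBuf for each hit
SndBuf buf => dac;
"special:dope" => buf.read;    // built-in sample; swap in your own .wav

for( 0 => int i; i < 8; i++ )
{
    0 => buf.pos;              // jump back to sample start == retrigger
    100::ms => now;            // roll at fixed intervals
}
```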

Post by Veli-Pekka Tätilä
The major hurdle
here would be the ability to use a common file-open dialog or some other
browser to add samples. Or a dir which is polled for wave files periodically,
updating the samples used. Fruity's ability to auto-preview samples in a
pattern is a real timesaver, and the same is true of the Forge open dialog.
OK, yes, that would be a hurdle, but you can assign a key to reloading the
sample you are using, if you'd like to have a wave editor next to ChucK.
Auto-scanning a directory isn't going to work just yet.


[sid]
Post by Veli-Pekka Tätilä
It is, but it is limited in terms of timing and range of parameters. You
cannot use controllers to freely sweep the noise frequency with ease, which
is trivial in C, especially on the C64 itself.
Then I suppose the only option will be adding a bit of C like that and
recompiling....


Post by Veli-Pekka Tätilä
I don't find it very natural myself, but at least it retains the gaps in
intuitive places. It would be worse if esc was the first step. You could
maybe also blink the num lock, scroll lock and caps status LEDs for
accents, portamento or whatever, to indicate something about the current
step. Idea stolen from MAME.
I'd like that too. I think a HID-out that could do that is on the list as
well.
Post by Veli-Pekka Tätilä
f1-f4 pick one of the four steps in the current beat, supposing 1/16 notes
f5-f8 select the beat in a measure, supposing 4/4
f9-f12,esc pick the measure in a 4-measure pattern to edit
This is much like binary numbers, in that the right-most digits, if we number
bits in a little-endian-ish way, control larger quantities.
Yes, makes sense, but needs some thought. Because I like to play live, one of
my prime demands is that it needs to be usable in the dark, after drinks and
under stress.

Yours,
Kas.
Veli-Pekka Tätilä
2007-03-10 18:28:09 UTC
Permalink
Hi,
As you mentioned off-list, some of the accessibility stuff is really OT here.
So I'll skip and snip more than usual.


Kassen wrote:
[usability]
Fortunately a little more thought goes into DAW controls, or drawing out
expressive modulations would be a total
nightmare instead of a chore, but I still think this is a prime example
of how ChucK can be used to make our own tools where the standard
ones are lacking (see, not so OT at all!)
I agree, although some sort of a GUI would be nice, but I know that's on the
wish list. Well, even if we ignore the controls, many soft synth GUIs have
serious issues. I've seen some that have functions only in the right-click
context menu. Although I'm a big fan of context menus, it took me quite a while
to discover it in a soft synth. Again, the heuristics here are that the
right-click menu is a power-user feature; you should not depend on people
using it, and so the functionality should be mirrored somewhere.
Post by Veli-Pekka Tätilä
For an example where real knobs shine, consider a knob that has a notch in
the center so you can easily zero a signed parameter.
Some softsynths try to get close to something similar by automatically
scaling the mouse sensitivity... That would lead to issues with the tablet
I experimented with
You mean a tablet PC? Yeah, I can believe that. And how do you drag and drop
with a pen interface anyway? I'm sure the screen reader mouse emulation does
not like it, either. It can drag and drop in software, but only if the
source and destination show up as different focusable objects to the reader.
Needless to say, dragging an arbitrary amount away from the center of a knob
cannot be emulated with current reader software.
The good news is that if you'd like to experiment with the mouse, you
can read its raw data in ChucK using the mouse Hid. That will
completely ignore where on the screen it is and allow you to link
movement directly to sound, which could be interesting.
Yes, especially as mine has got five buttons. You could make it modulate
different things depending on which button or combo of buttons is pressed.

[Quest for Fame V-pick]
I'd love to find that controller, looks like serious fun.
The game is great fun, but the controller is not. It is plugged into the serial
port and only transmits simple pulses when the pick is used. So I'd say it
is about as good as a single keydown on the keyboard for control purposes.

[ChucK v.s. modulars]
Generally with ChucK you depend more on modulations being based on
time periods, and much less on extracting info from zero crossings in
control signals. Another thing is that, due to their structure,
multiple processes running in parallel on modulars need to be
re-synced often (likely every cycle, to avoid drift and edges arriving
a fraction too early or late); with ChucK that's not necessary and
timing is much more dependable.
Yes, all timing is auto-synchronized to the sample, the way I understand it.
Still, ChucK does require rethinking common modular synth designs, as you
say, in terms of how to implement them.
back and forth between audio and control signals tends to be CPU
intensive.
Is there a difference? I didn't know that. I was kinda hoping I could use
any audio signal for control, or vice versa. Which raises questions like: how
do you convert between the two, and what's the event rate? I think the
manual, though it has example code, lacks formalism in the style of, say, the
K&R C book, of which I'm a big fan.

[multiplexing]
Post by Veli-Pekka Tätilä
Say I'd like to create a basic multiplexer for selecting osc waves,
Have a Ugen reference named osc and chuck it to DAC. Then switch on the
value of another variable using nested if else if else if constructs
to assign instances of the different oscs on that ugen ref, to
Yes, I think so, particularly if you combine it with unchucking the
connection. An array to hold the locations, and a function that would
take the one osc to be connected as a parameter, could deal with it for
you.
I see, I'll try something like that out. As with so many other coding things,
the best way to learn is to write and make mistakes <smile>. But I've
noticed that as a learner I usually like to start with a good book. In this
case, the manual is far from complete, so looking at the examples is
something I'll have to do eventually.
Post by Veli-Pekka Tätilä
What if I need to use a similar multiplexing structure in ten places
in an instrument and also the amount of inputs and outputs varies?
You can have any number of classes straight in your file. If you'd
like to import classes, you'll need to make them public and put them
in a separate .ck file (one public class per file) and Machine.add
those before using them.
Ah, good info; I think this will get me going, thanks. I read about class
files but didn't realize I could use the same add operation to include
code.

[sequencer design]

[Electribes]
and the MIDI implementation is braindead.
Yes, agreed. Anyone who claims that using NRPNs for about 50 parameters, and
only in 7-bit ranges, is a good idea is, well, not wanting to make their
interface musician-friendly. Controllers and multiple channels would do
equally well with less overhead.

[back to your ChucK seq]
Post by Veli-Pekka Tätilä
One exception: obviously, if you use a function for adding new notes,
you should be able to turn on silent steps, too.
Yes, that's an area with big questions. I'm trying to make note input
(with regard to timing) uniform with beats, but notes have so many
other properties as well that that scheme leads to exceptions in the
interface. Ironing those out in a sensible way is a good way of
losing some sleep <grin>.
Yes, and you might also want to add the ability not to quantize user input.
The Alesis SR-16, whose step editing I already dissed, actually is one of
the few drum machines which support both quantized and unquantized input.
Post by Veli-Pekka Tätilä
The major hurdle here would be the ability to use a common file open
dialog or some
other browser to add samples. OR a dir which is polled for wave files
OK, yes, that would be a hurdle, but you can assign a key to reloading the
sample you are using
Yes, that could work. Another thing I'm going to try at some point is tap
tempo, and then the ability to loop a portion of a song when you hit a key,
such that you can determine the unit of looping as musical time. I've
noticed that when I use the software-based volume control buttons on my
laptop, Winamp playback stutters in a cool and half-musical way. If only I
could sync that and make the process more controllable...
Post by Veli-Pekka Tätilä
f1-f4 pick one of the four steps in the current beat, supposing 1/16
notes f5-f8 select the beat in a measure, supposing 4/4
f9-f12,esc pick the measure in a 4-measure pattern to edit
Yes, makes sense but needs some thought.
This would be a good way for selecting one step for editing, or two steps in
defining a range, even in a 4-measure pattern. But the downside here is that
it is hard to sweep through, say, 10 steps, whereas you can use your finger
to quickly sweep a range of buttons on a hardware machine, or do a drag with
the left mouse button in Fruity to simulate that. But nothing would stop
you from adding a func for defining two steps and filling the range between
them.
one of my prime demands is that it needs to be usable in the dark,
Which is interesting. One academic paper I read recently on accessibility
said that in addition to aiding, say, people with no sight, accessibility also
helps in special circumstances like smoky or dark environments. So here we
are.
--
With kind regards Veli-Pekka Tätilä (***@mail.student.oulu.fi)
Accessibility, game music, synthesizers and programming:
http://www.student.oulu.fi/~vtatila/
Kassen
2007-03-10 19:45:02 UTC
Permalink
Post by Veli-Pekka Tätilä
Hi,
As you mentioned off-list, some of the accessibility stuff is really OT here.
So I'll skip and snip more than usual.
It all kinda interacts. I imagine there must be more people who want to put
ChucK in clubs.
Post by Veli-Pekka Tätilä
Again the heuristics here are that the
right-click menu is a power-user feature, you should not depend on people
using it and so the functionality should be mirrored somewhere.
I think you can seriously wonder how suitable sub-menus are for instruments
at all. Various synths - including ChucK, if we look at programming and using
the HID/MIDI/whatever as separate - use two different interfaces for
configuring and playing.

It gets more tricky with pro-level keyboards and grooveboxes and so on. I
don't think sub-menus are such a good idea there. Generally, I think many
instruments would be better with fewer but more carefully planned-out
features. The piano has lasted centuries already; I don't think the MC-303 will.


Accessibility and playability have a huge overlap. You might find this PDF
worthwhile; it's by ChucK co-author Perry Cook and on basically this topic:
http://soundlab.cs.princeton.edu/publications/prc_chi2001.pdf

You mean a tablet PC, Yeah I can believe that.


Actually I meant one of those graphic designer tools that replace a mouse with
a pen and use absolute (not relative) controls. I had hoped the absolute
desk-screen mapping would make me faster thanks to muscle memory, but instead
auto-scaling resolutions made it a very unpleasant affair in Ableton.


Post by Veli-Pekka Tätilä
And how do you drag and drop
with a pen interface anyway?
Exactly like with a mouse or with a chopstick on a dinner-plate.
<grin>

That's not the hard bit, assuming you are used to a mouse anyway.

Post by Veli-Pekka Tätilä
I'm sure the screen reader mouse emulation does
not like it, either. It can drag and drop in software, but only if the
source and destination show up as different focusable objects to the reader.
Needless to say, dragging an arbitrary amount away from the center of a knob
cannot be emulated with current reader software.
That would probably mean the same effect would be seen when drawing
envelopes.
Post by Veli-Pekka Tätilä
Yes, especially as mine has got five buttons. You could make it modulate
different things depending on which button or combo of buttons is pressed.
There are some examples here;
http://smelt.cs.princeton.edu/


[Quest for Fame V-pick]
Post by Veli-Pekka Tätilä
The game is great fun, but the controller is not. It is plugged into the serial
port and only transmits simple pulses when the pick is used. So I'd say it
is about as good as a single keydown on the keyboard for control purposes.
Oh. Right. That's disappointing. I had hoped it would be based on tilt/
inertia sensors.



[ChucK v.s. modulars]
Post by Veli-Pekka Tätilä
Yes, all timing is auto-synchronized to the sample the way I understand it.
Still ChucK does require rethinking common modular synth designs as you
say, in terms of how to implement them.
Exactly. Sticking to "modular tactics" will lead to ugly code and much CPU
overhead.
Post by Veli-Pekka Tätilä
back and forth between audio and control signals tends to be CPU
intensive.
Is there a difference? I didn't know that. I was kinda hoping I could use
any audio signal for control, or vice versa.
Well, you can if you need to.


Post by Veli-Pekka Tätilä
Which raises questions like: how
do you convert between the two, and what's the event rate?
You can convert control signals (the code you write yourself) to audio
using either "Step" or "Impulse"; both are UGens you might want to look up.
You can get audio-rate values to base
conclusions/events/modulations/whatever on by polling myugen.last(). The
event rate is whatever you make it; it equates directly to the amount of
time you advance each loop iteration.
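A sketch of the Step route: the loop below computes a control value every 5 ms, and Step holds it at audio rate in between (the 5 ms event rate and the sine modulation are arbitrary choices):

```
// control-rate computation turned into an audio-rate signal via Step
Step s => dac;
0.0 => float phase;

while( true )
{
    // compute the next control value and hand it to Step
    Math.sin( phase ) * 0.1 => s.next;
    0.05 +=> phase;
    5::ms => now;      // the event rate: one new value per 5 ms
}
```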

However, to detect things like edges and zero crossings you'll need to
advance time by just one sample each iteration, and loops like that eat a
lot of CPU.
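For completeness, a sketch of such a per-sample loop; this is exactly the kind of loop that gets expensive:

```
// detect upward zero crossings by advancing one sample at a time
SinOsc s => blackhole;   // blackhole pulls samples without audible output
2 => s.freq;

0.0 => float prev;
0 => int crossings;

while( crossings < 4 )
{
    s.last() => float cur;
    if( prev < 0 && cur >= 0 )
    {
        crossings++;
        <<< "upward zero crossing #", crossings >>>;
    }
    cur => prev;
    1::samp => now;      // one sample per iteration: accurate but costly
}
```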


Post by Veli-Pekka Tätilä
I think the
manual, though it has example code, lacks formalism in the style of, say, the
K&R C book, of which I'm a big fan.
Fair, but our manual costs less money than that book (in fact it's free), and
hence it has a volunteer maintainer who's currently busy with more urgent
stuff. To assist in that I started a page on the wiki to list things that are
unclear to people, or outdated, missing or downright likely to cause nuclear
melt-down or family dramas.

If there's something identifiable that you find confusing or unclear, then
you are more than welcome to join in; anybody can join the wiki and add
pages.
Post by Veli-Pekka Tätilä
I see, I'll try something like that out. As with so many other coding things,
the best way to learn is to write and make mistakes <smile>. But I've
noticed that as a learner I usually like to start with a good book. In this
case, the manual is far from complete, so looking at the examples is
something I'll have to do eventually.
A book would be nice, yes, but that's far off. I think the closest thing yet
is Ge's academic papers, which basically attempt to convince people that running
ChucK is a good idea at all.

C does have books, but C is quite old, proven to be useful, not widely
regarded as likely to explode, nor is it -currently- a very experimental sort
of thing. Perhaps most importantly, when you confess to using C people don't
look at you attempting to determine whether you are mad. This is all quite
unlike ChucK <grin>.
Post by Veli-Pekka Tätilä
Ah, good info; I think this will get me going, thanks. I read about class
files but didn't realize I could use the same add operation to include
code.
You can. Adding a file that has a public class to the VM will instantly make
that class available to other ChucK programs. In fact, you could always
start ChucK using:

chuck --loop my_class1.ck my_class2.ck my_class3.ck

perhaps through a batch file, and all your classes would always be there for
you to use.
Post by Veli-Pekka Tätilä
Yes, and you might also want to add the ability not to quantize user input.
The Alesis SR-16, whose step editing I already dissed, actually is one of
the few drum machines which support both quantized and unquantized input.
That's a thing for the future. Considering the latency that the Windows
drivers give, I'm quite happy to have implemented realtime input
quantisation... This is still good fun; I can now hammer buttons randomly
and the output will still be in perfect sync.
Post by Veli-Pekka Tätilä
Yes, that could work. Another thing I'm going to try at some point is tap
tempo and then the ability to loop a portion of a song when you hit a key,
such that you can determine the unit in looping, as musical time.
Go for it!


Post by Veli-Pekka Tätilä
I've
noticed that when I use the software-based volume control buttons on my
laptop, Winamp playback stutters in a cool and half-musical way. If only I
could sync that and make the process more controllable...
Most likely you can control it using the "keyboard repeat rate"; that's in your
Windows keyboard settings.
Post by Veli-Pekka Tätilä
one of my prime demands is that it needs to be usable in the dark,
Which is interesting. One academic paper I read recently on accessibility
said that in addition to aiding, say, people with no sight, accessibility also
helps in special circumstances like smoky or dark environments. So here we
are.
That's exactly what I was aiming at. Another thing I learned in an interface
design class was that people can learn a lot, interface-wise, but as soon as
they panic this breaks down. The example given was aircraft design. That's a
good example of a highly complicated interface that needs to stay usable
under high stress...


Kas.

Dominik Leuthold
2007-03-05 22:06:50 UTC
Permalink
all
Post by Spencer Salazar
ChucK does use DirectSound on Windows, which indeed makes attaining
realtime latency problematic. ChucK isn't based on SDL, but rather
on RTAudio, which does have ASIO support, though ChucK doesn't
support that backend currently. So, ASIO is slowly making its way
into ChucK, but we also don't have an ASIO platform to test on,
making the process more difficult.
I'm working on Windows and have a good soundcard with ASIO drivers
(an RME Hammerfall Multiface).
It's pretty frustrating for me to have such a card and not be able to work
with it in ChucK.
(I can't even use the card itself, because it has only MME, GSIF and
ASIO drivers...)

If I can help in any way to get ASIO into ChucK, I'm really willing
to do what I can (write code, debugging, testing... whatever you want).
Just let me know :-)

regards
/moudi