# `faust2api` Documentation

**NOTE:** this documentation was taken from the `faust2api` README.

## Overview

Thanks to its architecture system, [Faust](http://faust.grame.fr) can generate a wide range of objects compatible with different platforms (Linux, Windows, OSX, RPI, Android, iOS, ROS, Bela and Web/JavaScript) and tools (Max/MSP, SuperCollider, PD, VST, AU, LV2, etc.). The mechanism behind this was designed to be as modular and reusable as possible. Additionally, the Faust compiler can be embedded in any C++ program using the [LLVM](http://llvm.org/) technology, making it very portable.
 
The goal of the `faust2api` project is to provide a tool to easily generate custom APIs based on one or several Faust objects. On one hand, the [Faust DSP libraries](http://faust.grame.fr/libraries.html) implement hundreds of open source DSP algorithms that can be turned into C++, C, JAVA, JavaScript and LLVM bitcode and embedded in your applications. On the other hand, the Faust C++ libraries can carry out a wide range of tasks, ranging from connecting Faust DSP objects to a specific audio engine (CoreAudio, OpenSL/ES, Alsa, Jack, etc.) to adding MIDI and polyphony support, sensor data handling, etc. to the same object.
 
This is an ongoing project: for now, only iOS and Android are supported. Our goal is to cover all the Faust architectures.

**NOTE:** This documentation only provides high level information on how to use `faust2api`. For detailed tutorials on this topic, visit [this page](https://ccrma.stanford.edu/~rmichon/faustTutorials/#adding-faust-real-time-audio-support-to-android-apps).

## Using `faust2api`

Elements generated by `faust2api` greatly vary from one platform to another. Thus, custom documentation is generated and integrated into the package produced by `faust2api`, depending on the options provided to it. As a result, this page doesn't explain how to use the generated APIs, but rather gives high-level instructions on how to configure `faust2api` to carry out specific tasks.

The various options of `faust2api` can be displayed at any point by running:

	faust2api -help

`faust2api` is part of the [Faust distribution](https://github.com/grame-cncm/faust). To use it, Faust must be properly installed on your system (read the Faust README at the link above for more information on how to do that).

### Android Support

To turn a Faust code into an Android API, just run the following command:

	faust2api -android yourFaustCode.dsp
	
The JAVA package name of the API generated by `faust2api` is `com.DspFaust`. It can easily be changed by using the `-package` option. For example, to change the package to `com.you.DspFaust`, just run the following command:

	faust2api -android -package com.you yourFaustCode.dsp
	
### iOS Support

To turn a Faust code into an iOS API, just run the following command:

	faust2api -ios yourFaustCode.dsp
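
On both platforms, the package generated by `faust2api` is organized around a `DspFaust` object (a C++ class on iOS, wrapped as a JAVA class of the same name on Android). The following is a rough C++ sketch only: the constructor arguments, method names and parameter path used here are assumptions based on a typical `DspFaust` build, and the exact API depends on the options you used, so always check the README shipped with the generated package.

	#include "DspFaust.h"
	
	int main()
	{
	  // 44100 Hz / 512 frames are placeholder values; use what suits your app.
	  DspFaust dsp(44100, 512);
	  dsp.start();                               // start audio processing
	  // "/yourFaustCode/volume" is a hypothetical parameter path: the actual
	  // paths are listed in the README generated by faust2api.
	  dsp.setParamValue("/yourFaustCode/volume", 0.5);
	  // ... run your application ...
	  dsp.stop();                                // stop audio processing
	  return 0;
	}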

### Customizing the API

#### Polyphonic Object

To create a polyphonic object, specify its maximum number of voices:

	faust2api -android -polyvoices 12 yourSynth.dsp

or

	faust2api -ios -polyvoices 12 yourSynth.dsp
	
In this case, we're creating an object with a maximum of 12 polyphonic voices. Voices are only instantiated and computed when they are used, so this limit is just a safeguard.
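
If you need to trigger voices directly (without MIDI), the generated object typically provides methods to allocate and free voices by hand. The sketch below is only an illustration: the method names (`newVoice`, `setVoiceParamValue`, `deleteVoice`) and the `/synth/...` parameter paths are assumptions based on a typical `DspFaust` build and should be checked against the README generated with your package.

	#include "DspFaust.h"
	
	// Hedged sketch: allocate a voice, drive its parameters, then free it.
	void playOneVoice(DspFaust& dsp)
	{
	  long voice = dsp.newVoice();                        // allocate a polyphonic voice
	  dsp.setVoiceParamValue("/synth/freq", voice, 440);  // set a per-voice parameter
	  dsp.setVoiceParamValue("/synth/gate", voice, 1);    // start the voice
	  // ... later, when the note should end ...
	  dsp.setVoiceParamValue("/synth/gate", voice, 0);    // release the voice
	  dsp.deleteVoice(voice);                             // free the voice once released
	}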

#### MIDI Enabled Polyphonic Object

To create a polyphonic object controllable by a MIDI keyboard, run the previous command but make sure that the `freq`, `gain` and `gate` parameters are declared in your Faust code. For example:

	import("stdfaust.lib");
	freq = nentry("freq",200,40,2000,0.01) : si.polySmooth(gate,0.999,2);
	gain = nentry("gain",1,0,1,0.01) : si.polySmooth(gate,0.999,2);
	gate = button("gate") : si.smoo; 
	cutoff = nentry("cutoff",5000,40,2000,0.01) : si.polySmooth(gate,0.999,2);
	process = vgroup("synth",os.sawtooth(freq)*gain*gate : fi.lowpass(3,cutoff) <: _,_);

`freq` (which should be a frequency) will be automatically mapped to MIDI note numbers, `gain` (which should be a value between 0 and 1) to velocity and `gate` to *note-on* / *note-off* events. Thus, `gate` can be used as a trigger signal for any envelope generator, etc.

Note that we're using the `si.polySmooth` function here to smooth the value of some parameters. This function is very useful when creating polyphonic objects, as it only starts smoothing after the voice has been created, preventing "ugly" sweeps, etc.

In `si.polySmooth(gate,0.999,2)`, 0.999 is the pole of the lowpass filter used for smoothing, and 2 is the number of samples to wait after the voice is created before smoothing begins.
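
With `freq`, `gain` and `gate` declared, note events can be triggered directly on the generated object. Here is a rough C++ illustration; the `keyOn`/`keyOff` method names are those of a typical `DspFaust` build and should be verified against the README generated with your package.

	#include "DspFaust.h"
	
	// Hedged sketch: play a single note using the MIDI-style note methods.
	void playOneNote(DspFaust& dsp)
	{
	  dsp.keyOn(60, 100); // MIDI note 60 -> freq, velocity 100 -> gain, gate set to 1
	  // ... hold the note ...
	  dsp.keyOff(60);     // gate set to 0, the voice is freed after its release
	}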

On iOS, you might choose to use the `-midi` option, which embeds `RtMidi` support in the API. With this, any MIDI device connected to your iOS device will be able to control the Faust object. This option is not available on Android, since MIDI events have to be retrieved from the JAVA portion of the app and forwarded to its native part using the `propagateMidi` method (see the README generated by `faust2api` for more information on this).
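
As a rough illustration of that last point, forwarding a raw note-on message to the Faust object might look like the sketch below. The `propagateMidi` argument list shown here (byte count, timestamp, status/type, channel, data1, data2) is an assumption based on a typical `DspFaust` build, and in a real Android app the call would usually be made from the JAVA wrapper, which exposes the same method; check the README generated with your package.

	#include "DspFaust.h"
	
	// Hedged sketch: forward a note-on (status 0x90 = 144) for MIDI note 60,
	// velocity 100, on channel 0.
	void forwardNoteOn(DspFaust& dsp)
	{
	  dsp.propagateMidi(3, 0.0, 144, 0, 60, 100);
	}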

#### Adding an Audio Effect to a Polyphonic Object

If creating a polyphonic synthesizer, it is very likely that you will need to plug its output into an audio effect like a reverb. Putting that effect in your main Faust code is not a good idea, since it would be instantiated for each voice, which would be very inefficient. Fortunately, `faust2api` can take a second Faust file as one of its arguments, containing an audio effect (which could be a chain of audio effects, of course). The only rule is that the number of outputs of the synth must be the same as the number of inputs of the effect. For example, your effect Faust file could look like this:

	import("stdfaust.lib");
	process = dm.zita_rev1;
	
`zita_rev1` is a stereo reverb with a built-in UI that can be connected to the synthesizer presented in the previous section. This code MUST BE saved in a file with the same name as your synth file and ending with `_effect.dsp`. For example, if the synth is saved in `yourSynth.dsp`, then the effect must be stored in `yourSynth_effect.dsp`. 

To generate the corresponding API, run:

	faust2api -android -polyvoices 12 -poly2 yourSynth.dsp

or

	faust2api -ios -polyvoices 12 -poly2 yourSynth.dsp

Keep in mind that the package generated by `faust2api` contains a README file that you should really read at this point! Also, for more information, check our tutorials on [Using `faust2api` to Add Faust Audio Support to Android Apps](https://ccrma.stanford.edu/~rmichon/faustTutorials/#adding-faust-real-time-audio-support-to-android-apps) and [Using `faust2api` to Add Faust Audio Support to iOS Apps](https://ccrma.stanford.edu/~rmichon/faustTutorials/#adding-faust-real-time-audio-support-to-ios-apps).

## Additional Resources

* [Using `faust2api` to Add Faust Audio Support to Android Apps](https://ccrma.stanford.edu/~rmichon/faustTutorials/#adding-faust-real-time-audio-support-to-android-apps)

* [Using `faust2api` to Add Faust Audio Support to iOS Apps](https://ccrma.stanford.edu/~rmichon/faustTutorials/#adding-faust-real-time-audio-support-to-ios-apps)