airloaf / VSSynth
Software synthesizer written in C++, using SDL2 for its audio backend.
Home Page: https://airloaf.github.io/VSSynth
The sequenced beat example project lacks documentation explaining how the sequencer works, and its README file is very barebones.
The README should include:
The doxygen documentation doesn't show the Middleware namespace within the namespace list.
Now that we are able to read MIDI files for note events, we should create sequencers with the notes queued.
The piano and sequenced beat examples currently render a blank white background. This should be replaced with something more interesting, probably a picture of piano keys labeled with the respective note and keyboard key.
We should create a single header with all the VSynth includes. This can reduce the complexity of our examples and make it easier for everyone using this library.
The middleware interface should not be in the middleware folder. The middleware folder is strictly for middleware which is ready to be used.
A simple tone sound generator would be a nice addition and an easy way to introduce people to the synth. The tone sound generator would output the waveform passed into its constructor. Unlike the instrument sound generator, it does not need an envelope.
The tone interface should look something like this:
Tone tone(wave);      // Create a tone object with the given waveform
tone.play(frequency); // Play the wave at the specified frequency
tone.stop(frequency); // Stop playing the wave at the specified frequency
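The interface above could be sketched as a small class. This is only a sketch under assumptions: the `Waveform` alias and the `getSample` method are hypothetical stand-ins for whatever VSynth's sound generator base class actually uses.

```cpp
#include <functional>
#include <set>
#include <utility>

// Hypothetical sketch; VSynth's actual base class and method names may differ.
using Waveform = std::function<double(double frequency, double time)>;

class Tone {
public:
    explicit Tone(Waveform wave) : mWave(std::move(wave)) {}

    void play(double frequency) { mFrequencies.insert(frequency); }
    void stop(double frequency) { mFrequencies.erase(frequency); }

    // Sum the waveform across every currently playing frequency.
    double getSample(double time) const {
        double sample = 0.0;
        for (double f : mFrequencies) {
            sample += mWave(f, time);
        }
        return sample;
    }

private:
    Waveform mWave;
    std::set<double> mFrequencies;
};
```

Since there is no envelope, `stop` can silence the frequency immediately by removing it from the active set.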
The Instrument class in VSynth does not support many features required by the MIDI standard, including note velocities, pitch bends, and patch/program changes. The polyphonic instrument class is also not the most efficient fit for MIDI: it uses mutexes to turn notes on and off. That works well when there is a large or unbounded number of possible notes, but the MIDI standard only has 128, so it would be more efficient to ditch the mutexes for the MIDI case.
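One way to drop the mutexes, sketched below: because MIDI has a fixed range of 128 notes, a fixed-size table with one atomic per note is enough for a single-writer/single-reader setup. The class and member names here are illustrative, not VSynth's real instrument API.

```cpp
#include <array>
#include <atomic>

// Lock-free note table for MIDI's fixed range of 128 notes (sketch).
class MIDINoteTable {
public:
    void noteOn(int note, unsigned char velocity) {
        if (note >= 0 && note < 128) mVelocities[note].store(velocity);
    }
    void noteOff(int note) {
        if (note >= 0 && note < 128) mVelocities[note].store(0);
    }
    bool isActive(int note) const {
        return note >= 0 && note < 128 && mVelocities[note].load() > 0;
    }

private:
    // One atomic velocity per note: the audio thread reads while the
    // MIDI thread writes, so no mutex is required.
    std::array<std::atomic<unsigned char>, 128> mVelocities{};
};
```

Storing the velocity (rather than a bool) also leaves room for the velocity support mentioned above.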
Related Issues:
#20
Currently we copy all the files from the build folder to the docs folder. We should instead use a script or package to publish the site properly. The following link has a tutorial on how to do it.
https://dev.to/yuribenjamin/how-to-deploy-react-app-in-github-pages-2a1f
The sequencer provided by my VSynth library only supports note on/off events. The MIDI format has many more events that should be taken into account (Program Change, Pitch Bend, etc.), so a new MIDI sequencer class should be created to make use of them. Similarly, MIDI Note On events take a velocity as a parameter; the velocity changes how loud the note is played, which the base Sequencer class cannot support.
Related Issues:
#21
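A MIDI-aware sequencer would need to dispatch on the event type. The status-byte values below come from the MIDI 1.0 specification; the enum and function names are hypothetical, not existing VSynth API.

```cpp
#include <cstdint>

// Channel-message types a MIDI sequencer would dispatch on.
// Values are the upper nibble of the MIDI 1.0 status byte.
enum class MidiEventType : std::uint8_t {
    NoteOff       = 0x80,
    NoteOn        = 0x90,
    ProgramChange = 0xC0,
    PitchBend     = 0xE0,
};

MidiEventType typeFromStatus(std::uint8_t status) {
    // The upper nibble is the event type; the lower nibble is the channel.
    return static_cast<MidiEventType>(status & 0xF0);
}
```

A `switch` on the result is then the natural place to handle velocity for Note On, and to add Program Change and Pitch Bend handling later.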
If you create an envelope which is set to not sustain, the release portion of the ADSR curve does not trigger properly. It seems as though it only goes through the attack and decay portions.
There should be a way to customize the settings of the synthesizer. For example, my desktop can sample at 48 kHz perfectly fine, but my laptop starts dropping samples at that rate.
Settings to specify:
A nice "marketing" website should be created for the project. It would make the library look more professional and possibly get more people to try out the library.
The MIDI file format has a selection of 128 instruments an artist can choose from, and those 128 instruments can be divided into roughly 16 categories. You can find more under MIDI Patches/Program Change Events.
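General MIDI groups its 128 programs (0 to 127) into 16 families of 8 (Piano, Chromatic Percussion, Organ, Guitar, and so on), so mapping a program number to its family is a single division. The function name is illustrative:

```cpp
// General MIDI: 16 instrument families of 8 programs each,
// so the family index is just program / 8.
int programToCategory(int program) {
    return program / 8; // yields 0..15 for valid programs 0..127
}
```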
Sphinx is a modern documentation generator that supports C++. We may want to consider using it.
Example Page: XTensor
The README.md does not properly explain how to use the CMakeLists.txt file. Certain variables are specific to my development environment and should be changed in other people's environments; this should be stated in the README.md.
A new sound generator capable of playing PCM audio (probably through WAV) would be pretty nice.
A middleware interface may be an interesting idea to implement. The synthesizer would first collect all its samples from the sound generators, then pass the mixed sample to the middleware. The middleware would return a sample back to the synth, either the original or a modified one, and the synth would pass the returned value to the speakers.
Here are some Middlewares that may be interesting to implement:
I think the WAV recorder would be the most important of the three listed here. With the interface in place, other developers could implement their own middlewares.
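The flow described above could look like the following minimal sketch. All names here (`Middleware`, `process`, `applyChain`) are assumptions for illustration, not the actual VSynth design.

```cpp
#include <memory>
#include <vector>

// Sketch: the synth mixes its generators, hands the sample through a
// chain of middleware, and sends the result to the speakers.
class Middleware {
public:
    virtual ~Middleware() = default;
    // Receive the mixed sample and return it, possibly modified.
    virtual double process(double sample) = 0;
};

// Example middleware that simply scales the signal.
class GainMiddleware : public Middleware {
public:
    explicit GainMiddleware(double gain) : mGain(gain) {}
    double process(double sample) override { return sample * mGain; }

private:
    double mGain;
};

// Inside the synth's audio callback, each mixed sample would pass
// through the chain before reaching the output buffer.
double applyChain(double sample,
                  const std::vector<std::unique_ptr<Middleware>>& chain) {
    for (const auto& mw : chain) sample = mw->process(sample);
    return sample;
}
```

A WAV recorder would be a middleware that writes each sample to disk and returns it unchanged.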
Under use cases it says:
"There are many things you could possible make with VSynth!"
A WAV recorder would let us output the synthesized music created by VSynth into a .WAV file. This would be a big addition, especially for the MIDI player created by this library. We may want to build it on the middleware addition discussed in #10.
Two interesting waveforms to add would be the Pulse and Noise waveforms.
The pulse waveform is similar to the square waveform, but instead of being on 50% of the time, its duty cycle can be specified (e.g. 80% on, 20% off).
The noise waveform is driven by a random number generator. It works great for percussive-sounding instruments.
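Both waveforms are a few lines each. The signatures below are illustrative; VSynth's existing waveform generators may use a different convention.

```cpp
#include <cmath>
#include <random>

// Pulse wave: high for `duty` of each cycle, low for the rest.
// A duty of 0.5 reproduces the ordinary square wave.
double pulse(double frequency, double time, double duty) {
    double phase = frequency * time;
    phase -= std::floor(phase); // fractional position within the cycle
    return (phase < duty) ? 1.0 : -1.0;
}

// Noise: uniformly random samples in [-1, 1]; good for percussion.
double noise() {
    static std::mt19937 rng{std::random_device{}()};
    static std::uniform_real_distribution<double> dist(-1.0, 1.0);
    return dist(rng);
}
```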
We have two new examples: MIDI and Tone. Both should be added to the README with proper descriptions of what they do.
The piano key map takes up a small portion of the upper-left corner. It should probably be stretched across the screen.
The MIDI example doesn't use the synthesizer and doesn't produce any sound. It should be removed until an actual MIDI example has been started.
Make the sequencer loop when it runs out of notes to play.
LFO is on the list of features, but we do not have a working example of it.
If you press a key multiple times, the envelope's amplitude snaps back to 0 immediately after the hold function is called, which creates a popping noise. Similarly, releasing a note while it is at its peak produces a minor but audibly noticeable pop. It would be better if the amplitude were tracked rather than reset each time.
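The amplitude-tracking fix could look like the sketch below: on a retrigger the attack resumes from the current amplitude instead of snapping to zero, which removes the discontinuity that causes the pop. The class name and structure are illustrative, not VSynth's actual Envelope.

```cpp
#include <algorithm>

// Sketch: attack that resumes from the current amplitude on retrigger.
class PopFreeAttack {
public:
    explicit PopFreeAttack(double attackRate) : mAttackRate(attackRate) {}

    // Called when the key is (re)pressed: the current amplitude is kept,
    // not reset to 0, so there is no jump in the output signal.
    void trigger() { mAttacking = true; }

    // Advance the envelope by deltaTime seconds and return the amplitude.
    double step(double deltaTime) {
        if (mAttacking) {
            mAmplitude = std::min(1.0, mAmplitude + mAttackRate * deltaTime);
        }
        return mAmplitude;
    }

private:
    double mAttackRate;
    double mAmplitude = 0.0;
    bool mAttacking = false;
};
```

The same idea applies to release: ramp down from the tracked amplitude rather than from a fixed sustain level.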
The ability to pause a sequencer or instrument would be nice. The sound generator class should support pause() and unpause() functions to achieve this.
The Generators folder has a capital G in it. It looks very odd, especially since the utils folder has a lowercase u.
All sound generators have the same amplitude within the synthesizer. This should be configurable by the programmer. It's important to note that sound works on a logarithmic scale.
Now that the middleware is complete (#10), we can potentially add a lowpass filter as a middleware.
Create a simple logo to make the README look more interesting. Perhaps just the word "VSynth", with the S drawn as a phase-shifted sine wave that looks like an S.
The documentation for the open function says:
"Opens a file for writing and spawns a writer thread."
We are no longer spawning a new thread, so the documentation should be updated.
The sequence begins immediately. Some sequences may want to be played at an offset (e.g. 5 seconds after the start of the synthesizer).
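The offset could be handled by giving the sequencer a start time and keeping it silent until the synth clock passes it. A minimal sketch (names are illustrative):

```cpp
// Sketch: a sequencer-local clock that starts `offsetSeconds` after
// the synthesizer does.
class OffsetClock {
public:
    explicit OffsetClock(double offsetSeconds) : mOffset(offsetSeconds) {}

    // Sequencer-local time; negative while the sequence has not started.
    double localTime(double synthTime) const { return synthTime - mOffset; }

    bool started(double synthTime) const { return localTime(synthTime) >= 0.0; }

private:
    double mOffset;
};
```

The sequencer would output silence while `started` is false and schedule its note events against `localTime` afterwards.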
When building the doxygen docs, the output goes into docs/html. It should instead go to docs/doxygen.
Now that we have the WAVWriter class, we should have an example showing how to record the synth using this.
The original intention with VSynth was to make an audio synthesizer using SDL2's audio backend. It may be desirable to support other libraries in the future and have the SDL2 synth be the default implemented synth.
New Classes:
Synthesizer (Abstract class that must implement the open, close, pause and unpause functions)
SDL2Synthesizer (Move the current Synthesizer code into here)
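The proposed split could be shaped like this. Only the skeleton is shown; the real SDL2 calls (noted in comments) would live in SDL2Synthesizer, and the boolean flags here stand in for actual device state.

```cpp
// Abstract backend interface with the SDL2 code moved into a subclass.
class Synthesizer {
public:
    virtual ~Synthesizer() = default;
    virtual void open() = 0;
    virtual void close() = 0;
    virtual void pause() = 0;
    virtual void unpause() = 0;
};

class SDL2Synthesizer : public Synthesizer {
public:
    void open() override    { /* SDL_OpenAudioDevice(...) would go here */ mOpen = true; }
    void close() override   { /* SDL_CloseAudioDevice(...) */ mOpen = false; }
    void pause() override   { /* SDL_PauseAudioDevice(dev, 1) */ mPaused = true; }
    void unpause() override { /* SDL_PauseAudioDevice(dev, 0) */ mPaused = false; }

    bool isOpen() const { return mOpen; }
    bool isPaused() const { return mPaused; }

private:
    bool mOpen = false;
    bool mPaused = false;
};
```

A future SFML port (#11) would then be another subclass of Synthesizer with no changes to client code.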
We should create a diagram showing how VSynth works. Ideally it will sit on the website below the features and before the code examples.
It should be a block diagram showing all the components and how data flows between them. It should also state somewhere that a separate thread handles all of this.
The first thing we would need for the MIDI player is the ability to read a file for MIDI note events.
Add support for the SFML library. SFML is one of the most popular graphics frameworks alongside SDL2. Both have powerful audio capabilities. It may be worthwhile to create a port for SFML.
Related Issues:
#11
When creating an envelope that does not sustain, the note lasts a lot longer if the key is tapped than if it is held down.