Virtual environment for developing Ableton Live MIDI Remote Scripts in PyCharm


I'm an experienced Java/Groovy developer, employed as such even.  But I'm brand new to Python.  My sole motivation for diving in now is customizing midi controller scripts for Ableton Live. 

I'm trying to set up a "virtual environment" in PyCharm to support my Ableton Live Midi Remote Script scripting project(s), but I've hit a wall and need help.  I'm not sure which section of my vast supply of ignorance is responsible.

The only instructions I have are in a file buried deep in the Ableton Live folder structure.  Of course, the instructions make project setup seem easy as 1, 2, 3.

* Working on abl.webconnector

 - create and activate a virtualenv (e.g. with virtualenvmanager)
 - install in the virtualenv the Python site-packages that we ship with Live:

#+begin_src bash
$ find $LIVE_REPO_ROOT/modules/python/site-packages/ -type d -name *.egg -not -name setuptools* -exec easy_install {} \;
#+end_src

 - install requirements for development:

#+begin_src bash
$ pip install -r dev_requirements.txt
#+end_src

 - run nosetests:

#+begin_src bash
$ cd $LIVE_REPO_ROOT/products/live/Live/AppWebConnector/Python/abl.webconnector/
$ nosetests -s --with-progressive --logging-clear-handlers tests
#+end_src

Unfortunately, I suspect these are instructions for programmers with access to Ableton's full git repo for the Live product, rather than for users like me. For example, the path to site-packages is different in my installation. I probably could have scripted something to mimic the 'find' command, but I ended up creating Run/Debug configurations to run that "easy_install" for each of the "*.egg" folders not named "setuptools*" in that directory. After that I ran the 'pip install -r dev_requirements.txt' command in my PyCharm console (git-bash), and then ran the nosetests there too. I made several minor changes before all tests passed. The changes were:

* removed the --with-progressive arg (that plugin didn't exist in my environment)
* installed a module named 'mock'
* edited lines 33/34 to the correct (local) path
* added a file to tests/data/ (a copy, compiled to foo_py25_mac.pyc)
* -- the error was at line 115 because foo_py25_mac.pyc wasn't in tests/data/
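For anyone else who hits this, the 'find' step from the quoted instructions can be mimicked in plain Python. This is only a sketch; the site-packages path and the easy_install invocation in the comment are assumptions you would adapt to your own install:

```python
import fnmatch
import os

def find_eggs(site_packages):
    """Collect egg directories, skipping setuptools, like the quoted 'find' command."""
    matches = []
    for root, dirs, _files in os.walk(site_packages):
        for d in dirs:
            if fnmatch.fnmatch(d, "*.egg") and not fnmatch.fnmatch(d, "setuptools*"):
                matches.append(os.path.join(root, d))
    return matches

# Inside the activated virtualenv you could then run something like:
# import subprocess
# for egg in find_eggs("/path/to/Live/site-packages"):  # hypothetical path
#     subprocess.call(["easy_install", egg])
```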

However, even though all 124 tests pass in the console, something still seems wrong in the project environment. I can see the eggs listed in the virtual environment, but sys.path in the Python console doesn't show any of those egg resources; and when I open code files, import statements that reference any of the Ableton Live components (things like import Live, or import MidiRemoteScript) are flagged as errors.

It feels like I am so close, yet so far away. My question here is: can anyone tell from this description whether my problem at this point is ignorance of PyCharm and Python in general, or ignorance of the Ableton Python API and how it works? If it's a Python/PyCharm issue, what is the issue and how do I fix it? lol



Hi Jon,

Like you, I'm doing some midi remote scripting with Live, and I'm quite a beginner in this field, so I'm not sure I can be of great help to you.

Why do you want to work on the Ableton webconnector? Isn't it something that has to do with connecting to the web to check the license? But maybe I'm mistaken.

Ableton remote scripting is not something official, so there is no support for it from Ableton. Unfortunately, Live is closed source, so it's normal that your import Live appears red in PyCharm: the module is not (and probably will not be, unless you know someone at Ableton :p) on your path. import Live works when you launch your remote script inside Ableton, though, and there is some documentation for the Live API here :  . Still, it means it's not easy to run tests, for example. Things are mostly tested live by reloading your set (which reloads the scripts as well).
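Since `import Live` only resolves inside Ableton's embedded interpreter, a small guard keeps a script file importable in PyCharm or in tests too. This is just a sketch; whether you want a `None` fallback or a fuller stub is your call:

```python
try:
    import Live  # the real module, only available inside Ableton's bundled interpreter
except ImportError:
    Live = None  # we're outside Live (PyCharm console, tests, ...)

def running_inside_live():
    """True only when Ableton's embedded interpreter provided the Live module."""
    return Live is not None
```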

Also, you should check out ClyphX Pro and especially ClyphX Pro user actions. It's a good way to start, at least it was for me. Then there is some documentation on the web for bootstrapping a remote script, some of it quite old but still interesting. The link above lets you preview all the built-in scripts and how they work (or you could just decompile everything yourself). Just add them as a content root in PyCharm and you're ready to go. If you're fine with Java I don't think you'll have trouble writing remote scripts, but it's true that good doc on the subject is hard to find.

I'm using Pycharm as well and I think it's great for remote scripting.




ps: this is what I was referring to when talking about good but old doc : 

Or just copy paste an existing remote script like Push or something simpler and modify it !

You usually don't need many things in your venv, as the Live Python API and the framework classes are loaded when you fire up a set.



I think I'm having a similar problem. Can any of you help me set up my PyCharm IDE so I can start coding? (I'm not really experienced in Python; I mostly do things in C++.)

Am I correct that I need to "import Live" in order to use the documentation? Where do I get "Live"? It's not in the package manager, and the repo doesn't have a "Live" folder either :/ (I'm sure I misunderstood something, but I really can't figure out what.)


Any help would be appreciated!

PS: I took a look at this and I managed to import _Framework, but the first example here is already not working for me :/ Is this outdated? Does someone have a more recent version of the code?


Hi Jonny !

For simple scripting there is nothing really mandatory to set up between PyCharm and Live; it's just more comfortable to set a few things up if you plan on working on it for a while.

Here are a few thoughts about remote script work in Live with PyCharm:

- Install and set up Python 2.7; Live hasn't reached Python 3 yet, unfortunately. That will prevent runtime failures.

- Set up your project structure so that PyCharm looks at the right paths (google "add content roots"; it's explained on the PyCharm site). Here, you could just add your remote script folder and call it a day.

- Now if you want _Framework autocompletion (or just to read the code), you should download the decompiled sources (first link you posted) and add them as a content root. For most code (like the stock control surfaces) you could instead decompile them in place (with uncompyle6). But for the _Framework classes and the ableton folder I had problems with that approach, so I would advise downloading the decompiled folder somewhere else and adding it as a content root. That way PyCharm is happy with the imports. The folder with the decompiled stuff is never going to be read by Live; it's just there so PyCharm is happy and you can read the code.

- Now if you want to modify an existing surface script, you obviously need to have it decompiled in place (uncompyle6 to the rescue). Let's say you want to pimp your Push 2 (like I did): you can decompile and modify it, or make a copy under a new name, for example.

- Note that you can import and use any class or function from any remote script in your remote scripts folder (e.g. Push 2 code, or ClyphX code). Files need not be decompiled in place for that (e.g. the _Framework classes); .pyc files are enough.

- Now for import Live: this is closed source, there is no lib to import. I just disabled PyCharm inspections on import Live. Live is loaded when you boot Ableton, that's all. Here are the API docs ( , ). The second one is more recent but not as thorough or well presented.

- The blogspot scripts are quite old, yeah, and they miss something that more recent scripts have: "with self.component_guard()", which you should put almost first in the constructor of your ControlSurface (it deals with dependency injection). That could be your problem. Anyway, at some point I realised it's better to look at existing decompiled code. There is just not enough doc on the internet about this...

- Personally I've learnt a lot by checking out the ClyphX Pro code as well as the Push 2 code (though Push 2 doesn't use the _Framework classes but ableton/v2). I would advise using the _Framework code, which seems to be the "canonical" way of scripting ^^.

- This video from Stray is great, albeit too short : . He writes great code. What's great with ClyphX is that there is good documentation about what a script can do, and then it's easier to go through the code and understand how he made it.

- My code is certainly not as complex as ClyphX but might interest you : . It's mostly a selected-track control script which lets me handle recording external synths more easily. I'm using it with a Faderfox EC4 (just knobs, note press, and CC scroll). Bindings are in the ActionManager file. I've wrapped part of the Live object API because I thought it was cool, but it seems people don't usually do this and instead use the API directly haha. Maybe it can help you.
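The `component_guard` point above can be sketched like this. Since `_Framework` isn't importable outside Live, the `ControlSurface` below is a tiny stand-in written only for illustration; all that matters is the shape of the constructor:

```python
from contextlib import contextmanager

class ControlSurface(object):
    """Minimal stand-in for _Framework.ControlSurface.ControlSurface (illustration only)."""
    def __init__(self, c_instance):
        self._c_instance = c_instance
        self.components = []

    @contextmanager
    def component_guard(self):
        # in the real _Framework this sets up the dependency-injection context
        yield

class MyScript(ControlSurface):
    def __init__(self, c_instance):
        ControlSurface.__init__(self, c_instance)
        with self.component_guard():
            # create sessions, mixers, button components, ... here
            self.components.append("session")

def create_instance(c_instance):
    # Live calls this hook when it loads the script
    return MyScript(c_instance)
```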

Best!


Thanks a lot for your huge response.

I will work through everything you've suggested in the coming weeks; I haven't had time yet!

I did find a Python Live API stub -> and managed to run it with the 10.1 XML. Accessing the API via JavaScript seems to be more common ( but if I'm right (correct me please if otherwise ^^), I could just fill the Python API stub with the information I can gather from the JavaScript version and have a usable API? Yeah?

thanks a lot


EDIT: but first I've got to make anything work at all; the "with self.component_guard()" didn't make the simplest example from the old tutorial work.


Hi Johannes, I didn't know about this GitHub repo, nor did I think about generating a stub from the XML. You rock!

It works like a charm, that's brilliant, I'm so happy haha.

Regarding JavaScript, I don't feel competent enough to answer, as I'm not really doing Max for Live development. From what I understand, the JS API is used only when embedded in a Max patch, exactly like in the compusition tutorial. The API is the same underneath (I guess it's just bindings to C++). There seem to be slight differences when calling methods, and JS seems more verbose. More people use Max for Live than Python, but actually I had trouble finding answers for the JS approach on the web as well. Keep in mind that apart from the interface creation code (last chapter), what is done in the compusition tutorial can be done in Python as well.

If you intend to do something real-time, or something that needs an interface with buttons and so on (a VST, basically), then Max would be the way to go, I guess. For my part, I wanted to build a control surface and get tighter control of the Live interface, so Max is useless here. It does the same thing as Python with a cumbersome syntax, no folders, no type hints, limited libraries, and no access to the _Framework classes or fat control surface code like Push, as far as I know... Python rocks ^^

I don't think you need to explore the JS API if you're doing Python stuff; the API is the same (from my understanding). I've just run the API stub script on the 10.1 XML as you did, run inspections on my (Live type-hinted) code, and everything was found (including, of course, the new Live 10 API properties). So I guess it's fine!

Too bad the "with self.component_guard()" didn't work; I don't know what it could be then. I have to say there is a bit of a learning curve with Python scripts. I also had some trouble making my first remote script work, and then with some concepts like the _Framework classes (components, controls), listeners, and deferring changes. But in the end it all works quite well. Just keep a light set with, like, one track and no plugins, log to the debug log, and reload your set non-stop to see what's happening ^^

I have an AutoHotkey binding to a Python script which focuses Ableton and reloads it, bypassing the save dialog, so it's no more than a two-second job:

import win32com.client
import win32gui

# bring the Ableton window to the foreground
win32gui.SetForegroundWindow(win32gui.FindWindow("Ableton Live Window Class", None))

shell = win32com.client.Dispatch("WScript.Shell")
# ... (the shell.SendKeys() calls that trigger the reload were omitted in the post)








Cheers !



Hey Thibault, you are helping me a lot, thanks for that!

By now I've managed to make a really simple script work (it's loading, and it's "starting" and "stopping" my session via keys).

But how did you manage to make it "work like a charm"? I'm nowhere near that. The generated stub doesn't do anything; there's no functionality in it, correct? If I try

app = Live.Application.get_application()

I do not get an instance of the application (I've fixed that by now, but it seems like I have to do this for everything?) and therefore I don't see that app.get_major_version() is a function of the app (I know it is, and by now it's displaying; I'm just trying to explain what I mean, can you follow me?)
This would also be really, really helpful: the remote debugger. I got myself a Pro license via my university and I managed to connect to the remote debugger in the example, but when I try to add it to my script:

from __future__ import absolute_import, print_function, unicode_literals
from .mdcr import mdcr
import pydevd_pycharm

pydevd_pycharm.settrace('localhost', port=4223, stdoutToServer=True, stderrToServer=True)

def create_instance(c_instance):
    return mdcr(c_instance=c_instance)

Ableton no longer allows me to assign the script to a controller.
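One hedged guess about why the script vanishes from the controller list: `settrace` runs at import time and raises when no debug server is listening yet, which aborts the whole module import, and a script that fails to import won't be listed. Deferring the attach into a guarded helper (to be called from `create_instance`, say) keeps the script loadable either way; the host and port below are just the values from the snippet above:

```python
def attach_pycharm_debugger(host="localhost", port=4223):
    """Try to attach to a waiting PyCharm debug server; never break script loading."""
    try:
        import pydevd_pycharm  # only importable when the debug egg is on the path
        pydevd_pycharm.settrace(host, port=port,
                                stdoutToServer=True, stderrToServer=True,
                                suspend=False)
        return True
    except Exception:
        return False  # no debugger available: let the script load normally
```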

thank you for your time and help



Hi again Johannes and happy new year !

Always happy to help! You're the only person I've talked to seriously about remote scripting since I started haha.

For the stub, when I said that it worked I just meant that the autocompletion works! It's empty, as you noticed, and is just meant to be dropped on your PyCharm Python path (add the stub folder to your content roots) so that all Live-prefixed type hints are processed and autocompletion works for the whole Live module! It's just for PyCharm, really: it fetches type definitions and adds autocompletion; obviously nothing gets executed while you are writing code. I took a look at the stub and saw it was just a bunch of Python classes, but I guess in e.g. JavaScript that would be TypeScript interfaces only. Nothing is supposed to get executed. At runtime the Python environment is different: it's the one bundled in Ableton (unfortunately still 2.7, these bastards ^^) and import Live imports the real API.

Now, I don't know if you are asking this because you're thinking about testing. If not, the following is not going to be interesting ^^.

Some people (me included) would like to have a real Live API stub and/or a headless Live, but that would be for running test suites only. I'm not an expert in this, but I think it would be quite a lot of development on the Ableton side, and extensive unit testing is not going to be a thing in the near future. I don't know how the Ableton team manages Push remote script releases. Do they test? I don't even know, but I'd love to talk to someone at Ableton. From other threads on the question, it seems no one has found a real stub or is using one. Of course you could still write one for e.g. pytest tests (I started doing it some weeks ago), but 1. it's a lot of work, and 2. since (interesting) remote script programming is mostly asynchronous, that would have to be emulated as well, which I guess really means a headless Live. Also, how can we test clicks on the interface? That's part of my script, because some things just cannot be done via the API (e.g. displaying an unactivated VST window, or grouping a track).

The only thing I can advise is to log and throw exceptions a lot whenever you can, and maybe post-process your log file with colors and stuff (I did something simple here).

Still regarding testing, I managed something that helps a little bit: I created an empty mock of Live, which lets me unit test the parts of the code that don't interact directly with Live, e.g. utility functions or classes (decorators and such). That way I can load my script in a Jupyter notebook and don't need to reboot Live every time I want to check that 2 + 2 equals 4 ^^. The micro stub is just there to do absolutely nothing except not throw errors on method calls.
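The "micro stub" idea reads roughly like this (a sketch of a null object, not the poster's actual code): one class that swallows every attribute access, call, and iteration, registered in `sys.modules` so that a plain `import Live` and any chained call succeed without raising:

```python
import sys

class NullLive(object):
    """Absorbs any attribute access, call, or iteration without raising."""
    def __getattr__(self, name):
        return self
    def __call__(self, *args, **kwargs):
        return self
    def __iter__(self):
        return iter(())

# register the stub so that a plain `import Live` succeeds in tests/notebooks
sys.modules.setdefault("Live", NullLive())

import Live
app = Live.Application.get_application()  # returns the stub, raises nothing
```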

Regarding the debugger, I have to say you have a lot of ideas I didn't have when I started haha. I never tried to set one up, and I remember reading in another thread (don't remember where) someone complaining about the lack of a Live stub and that a debugger couldn't be set up, so I thought it was a dead end. If there is a way to make this work it would be awesome!

I realise I probably talked about stuff you don't care about, but this way of communicating is like sending a telegraph: I always want to put in more, just in case ^^. If you want to talk more, maybe you can add me on Facebook! Anyway, I'm afraid this thread is going to be lost to remote script devs, as it's not on Ableton or NativeKontrol...

See you !





"Anyway I'm afraid this thread is gonna be lost to remote scripts devs as it's not ableton or nativekontrol "

Au contraire! :-) I found you guys by googling, and this is one of the more informative threads to find if you're jumping into remote script development.

Having said that: Hi! I have just started diving into Python because of KnobKraft (a sysex librarian) and mainly because I have a Mackie C4 controller which I want to get working in Live 11. I have opened a repository here ( ) and would very much like to get some feedback/help with it. The script kind of works but still has issues.

Now with Live 11 they switched to Python 3, and I actually managed to convert most stuff correctly. I also decompiled the Live 11 / Py3 scripts (_Framework, ableton, Mackie, Push), most of them successfully; they are in another repository.

What I need to do for my purposes is written down on GitHub, but my questions for today:

  1. Are _Framework and v2 classes usable interchangeably? Can I mix them, or do I need to stick to one?
  2. vpot_parameter vs. DeviceParameter (Mackie vs. LOM): I mainly use the Mackie Control scripts as a reference, and within those, everything parameter-related uses "vpot_parameter". On the other hand, everything Ableton has written uses what I would expect from the LOM model, i.e. "Live.DeviceParameter.DeviceParameter.value". I cannot find ANY reference in Ableton's scripts to vpot_parameter, nor ANY reference in the Mackie scripts to DeviceParameter. Anybody got an idea why that is?
  3. How do I use/generate the stub when I have an XML and use PyCharm? Do I actually need it when I already have the _Framework classes added as a content root?
  4. I ask all this in the hope of fixing Problem no. 1 on my GitHub (the Boost error). It could also be that fixing it fixes other stuff as well.

Looking forward to your feedback! :-)




Hi Markus!

I didn't expect any more comments on this thread, but that's good, the more the merrier ^^ It's true there are not a lot of (kind of an understatement) resources about remote scripting, and it's a shame considering how powerful and interesting it is! I'll try to help you the best I can :)

1. My understanding is that they are two versions of the code serving the same purpose. ableton/v2 seems more complete and therefore more recent; Push 2 uses it, so... I went for _Framework myself because a (famous) script I like uses it, but I would maybe switch to ableton/v2 if I had to do it again. Anyway, I didn't find it that useful to use many classes from either, because there is no documentation whatsoever and things sometimes seem tailor-made for Push, or otherwise quite specific. So I'm using their (very good) event system and a few classes, and that's all.

2. I know nothing about the Mackie Control scripts or about vpot. From what I understand (I checked the code a bit), it's just a parameter name and could be called anything; it's internal to the script. Live.DeviceParameter.DeviceParameter, on the other hand, is a LOM class defining a device parameter (as seen here and here). The reason you don't see it in the script is that the script accesses parameters without type-hinting the class name, but when doing e.g. "devices_on_track[j].parameters" it very probably gets a list of Live parameters (that is, List[Live.DeviceParameter.DeviceParameter]).

3. If you already added the decompiled _Framework classes to your content roots, then it's almost all good. You can just add the AbletonLive-API-Stub folder as a content root (the Live subfolder should be just below it, like on the GitHub). It's not that important really, but it adds autocompletion for the LOM methods and the stuff from import Live, like constants, and also type hints. To regenerate the stub for another API version, you can just copy the XML file, call it Live.xml, and run the generator script. It creates the module file under Live/ which makes the types understandable for PyCharm.

4. Regarding this last question, I took a look at your GitHub repo and it's not that easy to see the problem. But the kind of problem you describe (indexing after a LOM change, when deleting a device) usually happens when listeners are not set up correctly, or when you keep stale references to objects that Live replaced internally (e.g. tracks change underneath you when you add or remove them, so you need to update your model!). It can happen if you index an array with a stale counter variable, or when you do array.index(<stale device reference>), for example. I see there are listeners set up, so it's hard to know where the problem is.

EDIT: I just realized this last paragraph was complete bullshit on my part, especially the bit about Live track objects being replaced. Live objects don't change when other objects are added or deleted; that is, if we add a track, the other tracks are still valid. It doesn't change my point about keeping stale references to deleted objects, though.

One thing about your script: it is quite verbose and not so easy to grasp at first sight. For all the listener stuff I would use decorators like @subject_slot from _Framework, which are much cleaner than the old add_***_listener syntax! Also, maybe break the file into different parts? And use type hints if you can :p They help a lot! The only thing is you need to install the module with pip and then modify your path like so (with your own Python path, of course):

sys.path.insert(0, r"C:\Python27\Lib\site-packages")

I did it in my script anyway; works like a charm.
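The type hints mentioned above can be written in the Python 2.7 comment style, which PyCharm understands. The function below is just an invented example; on 2.7 the `typing` module is the pip-installed backport:

```python
from typing import List  # on Python 2.7 this is the pip-installed backport

def track_names(tracks):
    # type: (List[object]) -> List[str]
    """Comment-style annotations: valid 2.7 syntax, full PyCharm autocompletion."""
    return [getattr(t, "name", "?") for t in tracks]
```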


See you !


Wow, thank you so much for the elaborate answer! I agree, the more the merrier :-)

1. (v2 vs. _Framework): once I start refactoring the code, I'll probably use v2 to be on the safe side. For now, I still need to introduce any kind of delegation to the LOM at all, and as I am still very new to all this (Digitalization Consultant by day), I am not sure when to jump. My first attempt yesterday failed miserably after two hours :D

2. vpot_parameter: basically everything is done by hand in this script, which makes it hard to change and probably inefficient. I expected a "vpot_parameter = Live.DeviceParameter.DeviceParameter" at some point in the scripts but couldn't find one, which threw me off.

3. Thanks! I actually managed to fix that script for Py3; it's now:

import re
import sys
import codecs
import xml.etree.ElementTree as ElementTree
from os import makedirs
from os.path import relpath, join, exists, dirname

try:
    from StringIO import StringIO as BytesIO  # for Python 2
except ImportError:
    from io import BytesIO  # for Python 3


def main():
    script_dir = dirname(__file__)
    in_file = relpath(join(script_dir, "Live.xml"))
    out_dir = relpath(join(script_dir, "Live"))
    out_file = join(out_dir, "__init__.py")
    if not exists(out_dir):
        makedirs(out_dir)
    generate(in_file, out_file)
    return 0


def generate(in_file, out_file):
    xml = parse_xml(read_file(in_file))
    with codecs.open(out_file, "w", "utf-8") as f:
        f.write("from types import ModuleType\n")
        last_tag = None
        last_name = None
        last_doc = None
        for element in xml.findall("./*"):
            assert isinstance(element, ElementTree.Element)
            if element.tag == "Doc":
                last_doc = element.text.strip()
            else:
                generate_code(last_tag, last_name, last_doc, f)
                last_doc = None
                last_tag = element.tag
                last_name = element.text.strip()
        generate_code(last_tag, last_name, last_doc, f)


def generate_code(tag, name, doc, f):
    if doc is not None:
        doc = doc.replace("&gt;", ">").replace("&lt;", "<").replace("&amp;gt;", ">").replace("&amp;lt;", "<").replace("&amp;", "&")
    if tag is not None and name is not None and name != "Live":
        level = name.count(".") - 1
        indent = "    " * level
        short_name = name.split(".")[-1]
        if "(" in short_name:
            short_name = short_name.split("(")[0]

        print("Generating %s '%s'" % (tag, name))

        if tag == "Module":
            f.write("\n\n%sclass %s(ModuleType):\n" % (indent, short_name))

        if tag == "Class" or tag == "Sub-Class":
            f.write("\n%sclass %s(object):\n" % (indent, short_name))
            f.write("%s    def __init__(self, *a, **k):\n" % indent)
            indent += "    "

        if tag == "Method":
            args, ret, doc = parse_args_from_doc(doc)
            if args:
                f.write("\n%sdef %s(self, %s):\n" % (indent, short_name, ", ".join([arg[0] for arg in args])))
                doc = "%s%s" % (doc, make_arg_doc(args, ret, indent + "    "))
            else:
                f.write("\n%sdef %s(self, *a, **k):\n" % (indent, short_name))

        if tag == "Built-In":
            args, ret, doc = parse_args_from_doc(doc)
            f.write("\n%s@staticmethod\n" % indent)
            if args:
                f.write("%sdef %s(%s):\n" % (indent, short_name, ", ".join([arg[0] for arg in args])))
                doc = "%s%s" % (doc, make_arg_doc(args, ret, indent + "    "))
            else:
                f.write("%sdef %s():\n" % (indent, short_name))

        if tag == "Property" or tag == "Value":
            f.write("\n%s@property\n" % indent)
            f.write("%sdef %s(self):\n" % (indent, short_name))

        if doc:
            f.write('{0}    """\n{0}    {1}\n{0}    """\n'.format(indent, doc))
        f.write("%s    pass\n" % indent)


def parse_args_from_doc(doc):
    args = []
    ret = None
    if doc and ":" in doc:
        try:
            parts = doc.split(":", 1)
            raw_args = re.sub(r"^.*\( (.*)\) -> *([^ ]+) *$", r"\1, \2", parts[0])
            raw_args = raw_args.replace("[", "").replace("]", "").split(", ")
            ret = None if raw_args[-1] == "None" else raw_args[-1]
            for arg in raw_args[:-1]:
                arg_parts = re.split("[()]", arg)
                arg_name = arg_parts[2].strip()
                arg_type = arg_parts[1].strip()
                if arg_name == "self":
                    arg_name = "handle"
                args.append((arg_name, arg_type))
            doc = parts[1].strip()
        except Exception:
            pass
    return args, ret, doc


def make_arg_doc(args, ret, indent):
    arg_doc = ""
    for arg in args:
        if "=" in arg[0]:
            arg_parts = arg[0].split("=")
            arg_doc = "{0}\n{1}:param {2}: {2} defaults to {4}\n{1}:type {2}: {3}".format(arg_doc, indent, arg_parts[0], arg[1], arg_parts[1])
        else:
            arg_doc = "{0}\n{1}:param {2}: {2}\n{1}:type {2}: {3}".format(arg_doc, indent, arg[0], arg[1])
    if ret:
        arg_doc = "{0}\n{1}:rtype: {2}".format(arg_doc, indent, ret)
    return arg_doc


def read_file(name):
    with codecs.open(name, "r", "utf-8") as f:
        return f.read()


def parse_xml(text):
    """
    Create and return a namespace agnostic ElementTree.

    :rtype: ElementTree.Element
    """
    it = ElementTree.iterparse(BytesIO(text.encode("UTF-8")))
    for _, el in it:
        if '}' in el.tag:
            el.tag = el.tag.split('}', 1)[1]  # strip all namespaces
    return it.root


if __name__ == "__main__":
    sys.exit(main())

I'll try it out now. Especially for a complete noob like me, autocompletion of the LOM stuff will be very helpful.

4. Thanks for taking a look! Given the error the log shows, it's imho not a listener problem but a LOM change in combination with Boost. The only problem I can currently spot is the int vs. unsigned int for the first arg. Would you agree? I of course googled how to convert an int to an unsigned int in Python, and that's where I gave up (for the moment):

RemoteScriptError: ArgumentError
RemoteScriptError: Python argument types in
MidiMap.map_midi_cc_with_feedback_map(int, DeviceParameter, int, int, MapMode, CCFeedbackRule, bool)
did not match C++ signature:
map_midi_cc_with_feedback_map(unsigned int midi_map_handle, class TPyHandle parameter, int midi_channel, int controller_number, enum NRemoteMapperTypes::TControllerMapMode map_mode, class NPythonMidiMap::TCCFeedbackRule feedback_rule, bool avoid_takeover, float sensitivity=1.0)

Oh, and it's definitely not "my" script! Someone named Leigh Hunt wrote it initially in 2008, in Py2.5 for Live 8, and Jon massively worked on it. I only did the Py2-to-Py3 conversion and a couple of other things (some LOM stuff, fixing a couple of issues, and commenting). I'll try to bring the style of the code into the present, but I am still only at tuples etc. in my Python for Beginners course :D :D


Haha, yeah, I see!

Regarding your Python argument types issue, I can't really venture a guess as to why it fails, but you're right, it's probably not a listener problem. Unsigned int is a C/C++ concept and does not exist in Python (it's just an int there). You could maybe ensure it's positive, but I don't think that's it. You probably know the expected Python signature (along with the C++ one) of the method is in the website XML, like here for Live 11: 

What I would advise is to log the type of each variable you pass to the function and see whether it seems to correspond. I never used this part of the API, so...

It's normal to spend a substantial number of hours on this, especially if you don't know Python! I've spent a ridiculous amount of time on some stuff, and I've worked with Python before ^^. Regarding delegation to the LOM, what I personally did was create my own Track / Clip and so on classes. A bit of work (and you need to know a bit about object-oriented programming), but you'll learn a lot and it makes the code so much clearer! It could be overkill for simpler stuff, though.

Good job on the stub script! I'm not sure it matters much whether you run it with Python 2 or 3, though, because it only generates a simple Python file.

See you !


Thanks for taking the time to answer, and for your valuable ideas!

Yes, I am aware of the XML file; in fact, since I managed to add the stub as a source root, PyCharm even shows me stuff from that file/the LOM when hovering. The problem is: I still don't understand it :D


"What I would advise would be to log the type of each variable you pass to the function and see if it seems to correspond."

After messing around with the log (setting it to debug, trying to add "type", lots of googling) I can't get the log to show the stuff I want. Could you help me out?

This is what I've currently got:

Live.MidiMap.map_midi_cc_with_feedback_map(midi_map_handle, param, 0, encoder, Live.MidiMap.MapMode.relative_signed_bit, feedback_rule, needs_takeover)
self.main_script().log_message("potIndex<{}> cc_value<{}> received".format(type, type, encoder, param))




Haha, it will come! (the understanding ^^)

Do you mean the log doesn't show? I see your script does use log_message; that should work.

It's hard to tell from two lines of code what you're logging, but what you want is the class name of the object. Usually just logging the object shows its class, so just log_message(<your_object>). Then you can maybe log type(<object>) for additional info, and to get the class name you can see this: 

So before calling your method, log all of this for each parameter and check against the signature in the docs.

Also, be aware that you can quick-test things in the Python interpreter, or even better in a Jupyter notebook (very easy to set up). That way you don't need to relaunch your script five times just to find out the right info to log for an object ^^. I do both extensively when I want to test "pure Python" stuff. Testing your script is a bit more complicated, especially when it interacts a lot with the Live API.

cheers !



ps: I saw that your script does not extend a framework class (specifically not ControlSurface, nor ControlSurfaceComponent, from either _Framework or v2). I'm surprised, as I thought it was necessary. It must be legacy behavior.


See, Thibault, with a remote debugger you could test your script attached to Live; that's what I was aiming for all the time, but I still can't get it to work. I hoped that with Python 3 it would work, but now there is a missing built-in when using pydevd and various other debugger modules. rpdb worked once (idk why, but it doesn't seem to be any more useful to me than the log file; I can't connect it to PyCharm).

This is my first Python project, and with my main language being C++ I haven't had such a problem before and don't know how to fix it. As far as I understood, there is a Python interpreter compiled into the Ableton exe and an internal module is missing? (Is there someone skilled in Python reading this thread? Could you help? I think we're all new to Python, right?) Can I sideload it? Can I tell my code to look somewhere else for "msvcrt"?
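On the "which interpreter am I actually running in" question above, a tiny helper like this (an invented diagnostic, not from the thread) can be dropped into a script and its result written to the log, to see what the embedded interpreter actually ships:

```python
import sys

def _importable(name):
    """True if the module can be imported in the current interpreter."""
    try:
        __import__(name)
        return True
    except ImportError:
        return False

def describe_interpreter():
    """Log-friendly summary of the interpreter a remote script is running under."""
    return {
        "version": sys.version,
        "executable": sys.executable,
        "has_msvcrt": "msvcrt" in sys.builtin_module_names or _importable("msvcrt"),
    }
```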




hello all. I had kind of forgotten about starting this PyCharm community thread, but Markus found it and reminded me of its existence. The questions I had in the first post a year ago were almost entirely due to my ignorance about Python generally and Live remote script programming in particular. I think I started it before I had even "met" Markus online and learned about this (Leigh's) C4 starter-script. I have joined the C4 github project, but I don't have Live 11 yet, so I will be keeping the "backward compatibility" torch burning until I upgrade. And I haven't read all of the posts since November (so many words, LOL). I had a friend who knows way more Python than I do help me get going, but he also hit the "runtime interpreter is built into the Live exe" wall.

One question I still have is: what kind of support does Ableton offer to hardware manufacturers who have a new midi device in development and are writing a companion "remote script" so the new device will work in Live? Do hardware manufacturers also have to do "log-file-tail debugging" while developing their scripts? Or does Ableton offer documentation to hardware developers that explains "remote script development from scratch" (one doc for the "old school" way, and another for the "v2 way")?


# This string only contains two {} "placeholder symbols",
"potIndex<{}> cc_value<{}> received"

# but the code is telling .format() there should be
# four placeholders available
.format(type, type, encoder, param)

# edit: because I'm remembering the name "param" from the script,
# I think it might be a 'tuple' containing more than
# one value, which could throw off what you think is the correct
# "placeholder count". Any of those names could represent
# tuples. I suspect "type, type" is a typo here and not actually
# in the code. But that could also throw off expectations.
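A quick illustration of how str.format() behaves with a mismatched argument count (the potIndex/cc_value string is from the snippet above; the values are made up):

```python
# Extra arguments to .format() are silently ignored; the message
# still renders using only the first two values.
msg = "potIndex<{}> cc_value<{}> received".format(1, 64, "extra", "extra2")
print(msg)  # potIndex<1> cc_value<64> received

# Too FEW arguments is the failing case: IndexError at runtime.
try:
    "potIndex<{}> cc_value<{}> received".format(1)
except IndexError:
    print("not enough arguments for the placeholders")

# If `param` really is a tuple, unpack it with * so each element
# fills one placeholder:
param = (3, 127)
print("potIndex<{}> cc_value<{}> received".format(*param))
```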


you should check out clyphx pro and especially clyphx pro user actions.

I own several NativeKontrol Live scripting products: a few "Arsenal scripts" for older devices like the APC40 mark 1 and BCR2000, clyphx pro, and their Push2 script. That is probably the main reason I have not updated to Live 11 yet. The Arsenal Push2 script in particular stopped working with the Live 10.1.28 update (I was getting automatic updates from Ableton, and I used the Arsenal Push2 script a lot, so I am quite aware of when it broke :)), and that loss made me gun-shy about updating to 11. I'm especially hesitant because my loss of the Arsenal Push2 script functionality is coupled with the fact that there has been no update from the "site owner" Stray on the NativeKontrol forum since May or June 2020. Rumor has it that Stray now works on the inside at Ableton, so great things are coming. But the timing of such a complete disappearance in the middle of 2020 seems as reliable a basis for prediction as any rumor, so who knows.

Thanks everyone for all the details you've shared.  I need to (re?)watch some of Stray's videos too it looks like. and I agree the C4 remote script currently uses a "poor practice" of keeping track of changing data via "deep array indexing", although I don't want to cast any aspersions on the original author.  I have personally written much worse, but nobody is reviewing any code I was writing x years ago.  If anything, I applaud the original author for posting/sharing at all.  With that said, I agree it's verbose and not easy to follow, and I probably contributed to the verbosity when I was "humanizing" the "original" decompiler output and trying to figure stuff out.  That's kind of where my steam ran out last summer, I was trying to get to an "object oriented" understanding of what that "database" of array dimensions should look like (to be more immediately comprehensible to me).  But perhaps "modular" would be a better way to describe the Python way.  I started "contracting" with an out-of-work guitar player friend of mine for "recording studio services" about that time and I switched gears from "script programmer" to "script user" to facilitate working to support that "contract".  I need to have "song projects" ready for a guitar player to add to before I need a studio guitar player, right? LoL, don't take that too literally.  I'm no "talented songwriter" waiting to be discovered.  But I wanted to help a friend, and I really like playing around in/with Ableton Live.

One point in common on this thread seems to be that you can write "Python" modules that tightly integrate with specifically written C/C++ modules, and you can also write "Max external" modules in C/C++ that work with "Max 4 Live". So theoretically one could get the best of both worlds, or all three worlds. Could one perhaps build a "max external" module in C++ that interacts with a running instance of Live (by dropping a "max 4 Live" device that uses the custom "external" module on a track?) such that the "external module / max 4 Live device" acts somewhat like a "debugger" Python Remote Script projects could attach to? I suspect the simple answer is yes, while the true answer is much more complicated. And the real question would be: is that something everyone on this thread would be interested in pursuing, particularly Johannes Hitzinger, who would probably represent the C++ nexus executing such a plan? (And it's not a plan, just an idea.) I think Stray did a lot of Python "remote scripting" stuff that is "public domain", and he also did custom C++ modules that remain private IP. I suspect the Arsenal scripts that stopped working for me after Live was updated to 10.1.28 stopped because of something in such private IP, maybe just some settings when compiled. I posted the exact error message logged by Live over on the NativeKontrol forum, but no response.


Hey Jon, thanks for your insight

your C++ idea might be heading in a good direction. I already tried to solve my problems with Ableton via M4L and C++ using this, and managed to write some test stuff; debugging works just fine with Visual Studio. If anyone is interested in using this, ask away, I can help! But creating a debugger this way that works with PyCharm isn't feasible in my eyes. I think I'll admit defeat on the debugging side and accept the fact that we have to do "log-file-tail debugging"

i was discussing the manufacturers thing with my roommate some days ago. I think the sized-down Python interpreter is a simple compiler option; they could simply deploy a "developer version" of Ableton with the full Python interpreter. I don't see why they wouldn't do this for manufacturers, and considering the post on the Ableton forum, it seems like they once didn't do this optimisation.


Hello everyone !

This post is starting to get some heat it seems haha. That's great maybe we can find a solution together !

Johannes Hitzinger you are asking yourself exactly the same question as me with msvcrt ^^. I gather you're on Windows? Are you the author of this post? :D I finally created a post on the ableton forum here regarding the same issue :

The second problem is that Live 11 refreshes scripts only on a hard restart, which is quite annoying as well.

Regarding msvcrt and module injection, I spent a few hours trying to find a fix with no success. msvcrt is a builtin module, and as such I don't believe you can just sys.path.append() and call it a day. There is no file on disk that you can overwrite or anything; it is built in. Plus, this kind of module seems platform dependent, and the C++ / Windows knowledge I have is non-existent, so... And it looks like msvcrt is not the only module that is missing..
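A small sketch of why sys.path tricks can't help here (this assumes a standard CPython; what Live's stripped embedded build actually ships is the whole question):

```python
# Built-in modules are compiled into the interpreter binary itself,
# so the import machinery never consults sys.path for them. You can
# list what a given interpreter was built with:
import sys

print("sys" in sys.builtin_module_names)     # True on any CPython
print("msvcrt" in sys.builtin_module_names)  # True on a full Windows CPython,
                                             # absent elsewhere and (apparently)
                                             # in Live's stripped build
sys.path.append("/some/folder")              # has no effect on built-ins
```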

I worked with Python before, but this is very specific Python build knowledge. I guess the guys who shipped the Python build would know how to go about this, but they don't seem to hang out on the Ableton forum. For me it's a dead end for now. I'll maybe try to contact Ableton support later. Let's tell each other if we find a solution !

Jon, regarding NativeKontrol scripts, I don't think it is impossible to migrate them. I see some use of subprocess in clyphx pro, but it does not seem blocking. My guess is that Stray is just too busy to do it, or maybe it's an agreement with Ableton ..?

Also your idea of using Max for Live to overcome the Python problems is interesting. But it seems quite complicated to me, as I know little of Max for Live and remote scripting is already hard ! I'm going to stay on Live 10 for now, at least while I'm working on my script :)

And lastly, Jon, for your question about remote script dev, I'm not so sure hardware manufacturers are using a different method (log and see ^^). I never saw Stray talk about another method in his videos, nor in the clyphx pro code. Still, they must have some documents to share with manufacturers for sure; it would be great if that exists and we can grab a copy ! It must be quite internal though, as there is no support for remote scripting from Ableton.


yea that's my post ^^ I've also tried to contact the dev team on twitter via pm, but they didn't respond and I don't believe they will. I remember reading somewhere that they don't offer documentation because they would have to support it, and they can't.

i took a look at your post on the ableton forum. Are you saying you are not able to get your previous script working with Ableton 11? This will be a problem for me as well, I think. Maybe we have to solve this another way: do you guys feel like pooling resources to create a sort of external API for this? Maybe we can do an RPC-style API? I really like this approach, because then everyone can use the language of their choice for coding the fun stuff (I'm looking to give Rust a go; no runtime errors sounds really nice for my live performance rig).
However, this assumes we can open some kind of network communication with the crippled Python interpreter Ableton provides.

it would also be nice to hear what the "professional custom script community" is doing at the moment. I do own clyphx pro and am wondering if I can use it with Live 11 anytime soon.





As you must have seen from my code, I have very little knowledge, but even I managed to convert a very old script to py3 so that Live recognizes it. So basically: take the old script, decompile it, and then either use 2to3 or manually replace long with int and xrange with range, delete the "u" string prefixes, and (what I also did) replace / with // where integer division is intended (depends on the age of the script). That's it. I'm convinced you guys can do that 😊
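A hedged before/after sketch of those manual edits (the variable names are made up, not from any actual script):

```python
# Py2 (decompiled)            ->  Py3
# value = long(raw)           ->  value = int(raw)
# for i in xrange(8): ...     ->  for i in range(8): ...
# name = u'Device'            ->  name = 'Device'
# bank = index / 8            ->  bank = index // 8   (in Py3, / is float division)

index = 17
bank = index // 8   # floor division keeps the old Py2 integer behaviour
slot = index % 8
print(bank, slot)   # 2 1
```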


And yep, Live 11 needs to be fully restarted before it recognizes new scripts (unless we find another way) 🤔


adjusting the script isn't the problem I'm talking about; Thibault isn't able to launch any other executable, as far as I can tell. I will definitely need to do this, and would also like to use Ableton 11.

i'll see tomorrow if I can run the xmlrpc server in a script and go from there. Worst case, I abandon Ableton completely and give it a run with VST only. We'll see, not sure for now.
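If it helps, here is a minimal sketch of that experiment (the port and the `ping` function are made up; whether Live's stripped interpreter even ships xmlrpc and threading is exactly what needs testing):

```python
# Start a tiny XML-RPC server in a background thread, the way a remote
# script might, so an external tool in any language could call in.
import threading
from xmlrpc.server import SimpleXMLRPCServer

def ping():
    return "pong"

server = SimpleXMLRPCServer(("127.0.0.1", 8765), logRequests=False, allow_none=True)
server.register_function(ping)

# Daemon thread so it never blocks (or outlives) the host process.
threading.Thread(target=server.serve_forever, daemon=True).start()
```

Any client could then call it, e.g. `xmlrpc.client.ServerProxy("http://127.0.0.1:8765").ping()`.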


I should have quoted 😊 my reply was with regard to NativeKontrol and clyphx


Question: What is the better approach given my current knowledge?

1. trying to fix everything based on the current state and THEN introduce all the new stuff


2. refactoring to properly use the new stuff (LOM using _Framework or v2, decorators, etc.), breaking the somewhat-working code completely in the meantime, and THEN fixing it?


@johannes this guy seems to have a working rpdb implementation (py2.7)


Wow.  This thread did heat up. So many responses to make, where to start...

@Johannes  I know about that "Max API" repo and I'm super interested in pursuing development of an "external" that would work with/as a "max 4 Live" device (related to the Mackie C4 device(s) both @Markus and I are using). 

At a high level, one way to implement my idea would be a C++ external that virtually hosts a custom JVM which "natively" runs a legacy Java 1.4 app: the "C4 Commander" app Mackie developed to accompany the C4 device (on the CD in the box). The C++ modules would handle all of the "JVM I/O", so the "C4 Commander" app would just run happily and normally in the "1.4 JVM" container, and the C++ modules could interpret (as needed) all of the associated "JVM I/O" and turn the data into "max (4 Live?)" device data and/or pass the "midi sysex data" directly to/from the "C4 associated" midi port of the host PC.

At another high level, one could avoid hosting a custom "JVM container" and "emulate" all of the existing C4 Commander functionality directly in C++ or some other compatible language (...and the rest.) 

But I don't necessarily know which of these routes (or some other) would be the best option to pursue first. I'm pretty sure the "emulate in C++" route would involve the most work on my part, if only because I suspect, for example, that a "boost or JUCE" library or two might already exist that greatly simplifies the work involved to host a lightweight "1.4 JVM" container inside my own custom C++ module.

Visual Studio and C++ were almost all I used in college.  But on the job since, I have worked in/with other languages mostly C# (with Visual Studio) and Java/Groovy (with several Java IDEs, lately Intellij). So it has been a while.  Today, I have a full JetBrains toolbox subscription, so I guess I would prefer to work in their C++ IDE (CLion?). Although I've heard "Visual Studio Code" is a free and versatile alternative for many purposes, I've become accustomed to the "JetBrains keyboard shortcuts" that are common across their IDE platforms.  The same Ctrl+Alt+b shortcut does the same action (jump to definition) in Intellij and in PyCharm, for example. Similarly, Shift+F6 initiates "refactor/rename..." in both/all JetBrains IDEs (by default) and I like that commonality...  I kind of remember that cycling74 only supports Visual Studio configurations.

The above imagined "max 4 Live" device (including all necessary pieces not specifically mentioned above) would interact with the Midi Remote Script for the C4 device that Markus is hosting (with me as a contributor), or a fork, in the sense that the remote script would defer "assigning actions to controller events" from the "encoder layouts hard coded in the python" to the "encoder layouts" generated in/by the "C4 Commander" app (or emulation) in the m4l device. (C4 Commander represents "instrument definitions" and "device layouts" as defined XML documents with .c4i (instruments) or .c4l (layouts) extensions.)

(Markus's C4 repo script currently only has two functioning "modes" (encoder layouts), although in a sense the script's "device mode" dynamically generates its "encoder layouts" on the fly depending on which device (on the current track) is selected. The mode is also static in the sense that whatever the first 24 "automation parameters" exposed by the Live device are, those parameters get mapped in that order to the "Bank 1" page, the next 24 to the "Bank 2" page, and so on. By separating the "device definition" from the "controller layout" as separate XML like that, a user becomes free to map their favorite "24 parameters" to "bank 1 of the layout" regardless of (but still respecting) the order that Live or a device expects. In the native "C4 Commander" world, if you had 4 different hardware synths (with .c4i instrument definitions available), you could make a layout where, say, each of the 4 rows of 8 encoders on the C4 is mapped to "the same 8 parameters" (Volume, Filter Cutoff, Filter Res, etc.) on each of the hardware synths (using all the info stored in each of the .c4i instrument definitions.)

...and you could make layouts dedicated to each synth, and you could mix and match to "split" the device layouts at runtime where, for example, one row of encoders uses XML configuration from the "selected layout" (among the "loaded layouts") and the other three rows of encoders use XML configuration from the N + 1 loaded layout.

Thibaultlebrun  I agree. It's not a showstopper, and hopefully Stray is still around supporting NativeKontrol in the near future. I've heard Ableton NDAs are extremely tight, so I'm pretty sure if such a "3rd-party-developer-Python-debug-interpreter" version of Live exists, the general public would not know or even hear about it. Yes, I'm sure Ableton doesn't deeply support users who do "remote script development" because they couldn't afford that kind of support for such a small percentage of their overall user base. I can imagine Ableton having had to spend more than annually budgeted simply supporting "certain" hardware vendors and remote scripting in enough years to make them gun-shy about ever "officially" supporting regular customers and remote scripting.

I'm not very interested in opening another can of worms to compete for my free time associated with "fixing my NativeKontrol scripts" myself.  Until/unless I hear official news that further support of the NativeKontrol products I already own has been abandoned, I don't see my level of interest changing there. I like stray's work. So if he is on the inside and he does have influence on the direction of the programmatic support of "remote scripting", then I believe good things are coming indeed.  Who knows, maybe 11.1 or 11.2 will be almost entirely about implementing stray's remote scripting integration ideas, and we are all kind of wasting our time right now because when his ideas hit "production" we're all going to want to start leveraging his "v3 model" with our scripts right away.   I'm also not champing at the bit to "get clyphx back".  Maybe if I had used it more, I would miss it more.  I sure miss the Push2 script though, and I didn't upgrade, but the Arsenal Push2 script still died after 10.1.28 released.

You are correct. Python is hard enough (at the moment).  

Regarding delegation to the LOM, what I personally did was create my own Track / Clip (and so on) classes. A bit of work (and you need to know a bit about object-oriented programming), but you'll learn a lot and it makes the code so much clearer
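Not Thibault's actual code, but a minimal sketch of the idea (the attribute names `name`, `clip_slots`, `has_clip`, and `length` follow the Live Object Model; the wrapper classes themselves are made up):

```python
# Thin wrappers around LOM objects, so the rest of the script talks to
# your own classes instead of raw Live API handles.
class Clip(object):
    def __init__(self, live_clip):
        self._clip = live_clip

    @property
    def length(self):
        return self._clip.length

class Track(object):
    def __init__(self, live_track):
        self._track = live_track

    @property
    def name(self):
        return self._track.name

    @property
    def clips(self):
        # Only slots that actually hold a clip
        return [Clip(slot.clip) for slot in self._track.clip_slots if slot.has_clip]
```

In a real script `live_track` would come from something like `self.song().tracks[i]`.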

Would you be willing to share your classes? Do you already have a repo? We may not be able to directly use your classes without starting the C4 script over from scratch, but you are correct: I'm sure we could learn a lot about how to possibly organize the script classes differently and how to "communicate with the LOM" from Python. Although I think Markus is getting a handle on communicating with the LOM pretty quickly, I am definitely interested in "code review" research on what exactly an "object oriented" Python approach to writing a remote script looks like. Part of what I still don't understand about Python generally is how "object oriented" works when objects don't get to define a "privacy boundary" that the language enforces (without reflection) and "everything" seems to be available globally anyway.
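For what it's worth, Python's answer is convention plus light name mangling rather than enforced access control; a quick illustration (the class and attribute names are made up):

```python
class Encoder(object):
    def __init__(self):
        self.value = 0         # public API
        self._cc = 24          # single underscore: "internal, please don't touch"
        self.__state = "idle"  # double underscore: name-mangled to _Encoder__state

e = Encoder()
print(e.value)             # 0
print(e._cc)               # 24 -- still accessible, just discouraged by convention
print(e._Encoder__state)   # 'idle' -- even mangling is only a speed bump
```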

Johannes Hitzinger  I haven't yet necessarily understood the thread(s) you all linked to, so I'm not sure what "this" in "external api for this" actually means. I understand the utility of passing around "plain text" (JSON- and/or XML-encoded), but wouldn't all sides need to agree on the exact "JSON RPC" schema to exchange, and then maintain some form of versioning of the schema through changes over time? An RPC API seems like a lot of investment for negligible return for a remote script project like the C4 script repo Markus and I are contributing to. I definitely get the utility of "attaching" to a process so you can understand what's going on with its internals while it is running. But we're just trying to get 2 of 4 "modes" working reliably and introduce a few new features right now, and I can't really even promise regular attention to that project. In addition to all the time that "the real world" requires, my (limited) personal-time focus drifts back and forth between programming tools to help me make music and actually making music in the studio I've built. I know I just got done expressing a lot of enthusiasm for a C++ max external project, and I do entertain such enthusiasm. However, it is also true that there isn't enough (free) time to devote to all of my various enthusiasms. My "number one focus" kind of jumps around from "high priority" to "high priority".

creating a debugger this way that works with PyCharm isn't feasible in my eyes

Maybe not a "real" debugger, but, for example, perhaps a utility that allows a Python Remote Script to log to the Max console instead of (or in addition to) the Live log, using simple Python import syntax, would be a place to start. I'm also thinking a kind of "browse-able test harness" of Live itself (the LOM) could be useful, in the sense that, like there is no "debugger" per se for the avionics in an airplane cockpit, engineers need to build "test harnesses" around all the avionics I/O systems. Such a test harness of Live might only report, browser-like, the status of the LOM: how many audio and midi tracks, which Instrument and Effect devices are loaded, which are selected, what parameters are selected, what events are firing, etc. I imagine that kind of real-time feedback could help Python script developers when "debugging" their own code. They wouldn't be stopping "Python bytecode" in its tracks like a true debugger, but developers would have a chance to "drill into" the LOM status and see what it "looks like" right before they do the thing that makes their script go crazy (and after, of course, but that's usually much less useful information)

Markus Schloesser We talk a lot over on GitHub, and I'm definitely interested in the other opinions here too.  An approach I would recommend is to not necessarily rigidly stick to one way or the other but blend the two.  Fix enough of the current workings to be sure the part you want to "convert to LOM model" is "stable enough", and then go for it.  Even if we end up totally abandoning the original script and starting from a clean slate to facilitate using the LOM more effectively, you will be learning things along the way in the meantime.  Things you would still need to learn, for example if we totally started over right now.  I mean, in a sense it's just easier to "stand on the shoulders of giants" to reach a certain altitude than to get there all by yourself. (That's not a pejorative phrase in the sense that everyone alive today stands on the shoulders of Maxwell, Euler, Copernicus, and thousands of others.  Literally standing on someone's shoulders to save yourself while they drown is also a "way to gain altitude")

to convert a very old script to py3 so that Live recognizes it. ...take the old script, ... manually replace long int with int, xrange with range, delete the "u" s, and maybe replace / with // (depends on the age of the script). [quote edited]

I will probably be "reversing" some of these steps (as needed) among others in order to make your/our repo script "Live 10" compatible (again).


Wow again.  I wrote a lot. (luckily, I'm in the habit of saving my "long time to write" posts to the clipboard before submitting on any web site.  I would have lost all of this post.  I got an "oops, 404" page the first time I tried to submit...)


well, I would be shocked if you can't do the same development in CLion, since the min-devkit uses CMake anyway. You should be free to choose whatever IDE you prefer!

as to your idea: using a VM sounds like a simple but overkill way of solving this. Maybe you could compile the Java application to wasm and then host that in C++; it's basically the same but cuts out the whole VM thing, which would be good for performance. But there is no way of telling if this performance gain is even relevant without trying both solutions, so making it work with your preferred solution first is the way to go! If you need any help, feel free to reach out.

"this" refers to the LOM API / other useful API stuff for controllers.
yes, we certainly would have to agree on a schema if we were to do this, but it would be pretty close to the LOM documentation, I think. After doing this we'd all have a "usable base" to iterate on. Maybe I'll grab Thibault's LOM implementation (can you tell me a little about how "complete" you think it is, Thibault?), upgrade it for Ableton 11, and maybe add some form of debugger and remote option. I have my last test of the semester tomorrow and will have more time after that. If we have testing in mind while designing this "ableton custom script base", we can make everything testable without Ableton running, and if we manage to document our script base, we might be able to give more people access to the amazing possibilities of custom remote scripts