I would like to use a Modelica model, which is based on an external library, for a co-simulation in another environment like ANSYS, Abaqus, etc. The model should be able to interact with another model. What is the easiest way to co-simulate a Modelica model?
For this purpose, the FMU export in OpenModelica seems to be the right approach. The problem is that the model is based on an external library, and therefore several issues occur (also stated here: https://openmodelica.org/forum/default-topic/2180-libraries-not-included-in-fmu).
In my case, I tried loading the FMU with the FMPy GUI (python -m fmpy.gui), which resulted in the error:
"Failed to load path/to/haeger_model_win64.fmu. The unit 'ml' of variable 'C_a.V' is not defined."
What works for me now: loading the .mo file with OMPython and simulating in Python. That approach seems limited, though, because I do not know how to interact with the model; I can only simulate it with preset parameters.
You can find the model (haeger_model.mo), the external library (HumanLib.mo) and FMU exports of haeger_model here: https://github.com/xi2pi/LPModelica
The simulation with OMPython works just like explained in the tutorial (https://www.openmodelica.org/doc/OpenModelicaUsersGuide/latest/ompython.html).
The problem is that I do not know how to interact with the model when simulating in Python. An approach with an FMU is preferred.
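In case it helps: once the FMU itself loads (e.g. after the unit issue is resolved), FMPy's low-level FMI 2.0 API lets you drive the co-simulation step by step and exchange values with the other model between steps. Below is a minimal sketch of that pattern; the input variable name 'u_in' and the step size are made up for illustration, only 'C_a.V' is taken from the error message above.

from fmpy import read_model_description, extract
from fmpy.fmi2 import FMU2Slave

fmu_filename = 'haeger_model_win64.fmu'

# read the model description and map variable names to value references
model_description = read_model_description(fmu_filename)
vrs = {v.name: v.valueReference for v in model_description.modelVariables}

unzipdir = extract(fmu_filename)
fmu = FMU2Slave(guid=model_description.guid,
                unzipDirectory=unzipdir,
                modelIdentifier=model_description.coSimulation.modelIdentifier,
                instanceName='haeger1')

fmu.instantiate()
fmu.setupExperiment(startTime=0.0)
fmu.enterInitializationMode()
fmu.exitInitializationMode()

time, stop_time, step_size = 0.0, 10.0, 0.01
while time < stop_time:
    # push values coming from the other simulator into the FMU
    fmu.setReal([vrs['u_in']], [0.5])        # 'u_in' is a hypothetical input
    fmu.doStep(currentCommunicationPoint=time, communicationStepSize=step_size)
    v_a = fmu.getReal([vrs['C_a.V']])[0]     # read an output back out
    time += step_size

fmu.terminate()
fmu.freeInstance()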
I am working on a Python desktop app. This app does some predictions. Right now I train my sklearn model using a Python script and save the parameters of the model as a dictionary in a YAML file. Then I bundle this YAML file into my Python app. When the app is used, the model is recreated from the parameters in the dictionary. I realized that people who have a different version of sklearn get an error. I tried to save my model in a pickle file instead, but in this case it produced a warning when the app was running on a machine with a different version of sklearn.
There is no guarantee that a given sklearn model will be compatible between versions of sklearn. Indeed, the implementation or the internal API may change between versions. See more information here.
If you stick to one version, the best way is indeed to pickle, and not to save the parameters in a YAML file. It's even better to use joblib to do so. See more information here.
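A minimal sketch of that workflow (the RandomForest on the iris data is just a stand-in for your actual model):

import joblib
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

# train once, with a pinned sklearn version
X, y = load_iris(return_X_y=True)
model = RandomForestClassifier(n_estimators=100).fit(X, y)

# persist the fitted estimator itself instead of rebuilding it from a YAML dict
joblib.dump(model, "model.joblib")

# inside the app, on the same sklearn version
model = joblib.load("model.joblib")
print(model.predict(X[:5]))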
I realized that people who have a different version of sklearn get an error.
In this case, create isolated Python environments using virtualenv so that the app always runs with the sklearn version it was built against.
Alternatively, you can just generate Python code from a trained model. This way you eliminate any possibility of object incompatibility. Here is a tool that can help with that: https://github.com/BayesWitnesses/m2cgen
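For example (a sketch only, with a LogisticRegression standing in for the real model; m2cgen's export_to_python generates a dependency-free scoring function):

import m2cgen as m2c
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

X, y = load_iris(return_X_y=True)
model = LogisticRegression(max_iter=1000).fit(X, y)

# generate pure-Python scoring code with no sklearn dependency
code = m2c.export_to_python(model)
with open("model_scoring.py", "w") as f:
    f.write(code)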
I am using an open-source Matlab toolbox for brain-computer interfaces (BCI). I want to send the brain imaging data over to TensorFlow for classification and get the results back into Matlab. Is there any way to pass data structures from Matlab to TensorFlow and get the results back into Matlab?
In case someone lands here with a similar question, I'd like to suggest a Matlab package I am currently writing. It's called tensorflow.m and it's available on GitHub. There's no stable release yet, but simple functionality like importing a frozen graph and running an inference is already possible (see the examples) - this is all you'd need to classify the images in Matlab (only).
The advantage is that you don't need any expensive toolbox nor a Python/Tensorflow installation on your machine. The Python interface of Matlab also seems to be rather adventurous, while tensorflow.m is pure Matlab/C.
I'd be glad if the package is of use to someone looking for a similar solution; even more so if you extend/implement something and open a PR.
So far the best way I found is to run your Python module in Matlab through Matlab's now built-in mechanism for connecting to Python:
I wrote my Python script in a .py file, imported tensorflow there, and used it in different functions. You can then return the results to Matlab by calling
results = py.myModule.myFunction(arg1,arg2,...,argN)
More detailed instructions for calling user-defined python modules in matlab could be found in the following link:
http://www.mathworks.com/help/matlab/matlab_external/call-user-defined-custom-module.html
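For completeness, here is a sketch of what the Python side (myModule.py, matching the call above) could look like; the saved-model path and the use of the Keras API are assumptions, and your module can build and run the graph however you like:

# myModule.py -- importable from Matlab via py.myModule.myFunction(...)
import numpy as np
import tensorflow as tf

# hypothetical path to a classifier saved earlier with model.save(...)
_model = tf.keras.models.load_model("brain_classifier")

def myFunction(data):
    """Classify one batch of (trials x features) data passed in from Matlab."""
    x = np.asarray(data, dtype=np.float32)
    probs = _model.predict(x)
    # return plain Python types so Matlab can convert the result
    return probs.argmax(axis=1).tolist()

Depending on your Matlab release you may need to convert the Matlab matrix (for example to a list or a py.numpy.array) before passing it in.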
While I admire, and am somewhat baffled by, the documentation's commitment to mediating everything related to TensorFlow Serving through Bazel, my understanding of it is tenuous at best. I'd like to minimize my interaction with it.
I'm implementing my own TF Serving server by adapting code from the Inception + TF Serving tutorial. I find the BUILD files intimidating enough as it is, and rather than slogging through a lengthy debugging process, I decided to simply edit BUILD to refer to the .cc file, in lieu of also building the Python stuff, which (as I understand it?) isn't strictly necessary.
However, my functional installation of TF Serving can't be imported into Python. With normal TensorFlow you build a .whl file and install it that way; is there something similar you can do with TF Serving? That way I could keep the construction and exporting of models in the realm of the friendly Python interactive shell rather than editing, crossing all available fingers, building in Bazel, and then /bazel-bin/path/running/whatever.
Simply adding the directory to my PYTHONPATH has so far been unsuccessful.
Thanks!
You are close; you need to update the environment as they do in this script:
.../serving/bazel-bin/tensorflow_serving/example/mnist_export
I printed out the environment update and did it manually:
export PYTHONPATH=...
Then I was able to import tensorflow_serving.
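The same thing can also be done from inside Python if you'd rather not touch the shell environment (the path below is only a placeholder for whatever the wrapper script adds):

import sys

# placeholder: append the directories the generated wrapper script
# (.../bazel-bin/tensorflow_serving/example/mnist_export) puts on PYTHONPATH
sys.path.append("/path/to/serving/bazel-bin/...")

import tensorflow_serving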
What is, in your opinion, the best way to get data (measured data, for example) into Modelica (Dymola)? Is it possible to import data from Python into Modelica (for example, into a combi-time-table)?
My idea would be as follows:
pre-processing of measured data in Python
load the data from Python into Modelica (combi-time-table)
run simulation studies (scripted in Python)
I would appreciate any suggestions.
That's probably a matter of opinion. But since you have to do much of your data post- and preprocessing in Python I would definitely export my (plant) model from Dymola as a co-simulation FMU and run it in Python.
In Dymola you can export FMUs and 'execute' them on the same PC that holds the Dymola license file. If you need to run the FMU on another PC, you'll have to buy a special binary export license.
There is a free Python package called PyFMI (www.pyfmi.org) which makes it easy to run an FMU in Python. See the examples at http://www.jmodelica.org/page/4924.
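A minimal sketch of what that looks like in PyFMI (the FMU file and variable names are placeholders, and the input trajectory stands in for your pre-processed measurement data):

import numpy as np
from pyfmi import load_fmu

# load a co-simulation FMU exported from Dymola
model = load_fmu("plant_model.fmu")

# set parameters / start values before simulating
model.set("tank.V0", 0.5)

# feed pre-processed measurement data in as an input trajectory
t = np.linspace(0.0, 10.0, 101)
u = np.sin(t)                                   # stand-in for measured data
input_object = ("u", np.transpose(np.vstack((t, u))))

res = model.simulate(final_time=10.0, input=input_object)
print(res["y"][-1])                             # 'y' is a hypothetical output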
PyFMI can be a bit tricky to get up and running (with the right Python package dependencies and so on). So if you are not an experienced Python user, I would suggest that you download the installer for JModelica.org, which will do much of the setting up for you.
Best regards,
Rene Just Nielsen
As Rene Just Nielsen pointed out, this is primarily opinion based.
To give another way of accomplishing your goal, try the DymolaInterface. You could either set the table parameter via the interface from Python, or use a .txt file which you create and alter in Python, so that Modelica just knows the path to the file. The interface comes with your Dymola installation under Modelica\Library\python_interface\dymola.egg, where you will also find documentation for the functions.
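If you go the .txt-file route, writing the table from Python takes only a few lines. A sketch (file and table names are up to you; this is the text format that Modelica.Blocks.Sources.CombiTimeTable reads with tableOnFile=true):

import numpy as np

# pre-processed measurement data: time in the first column, signals after it
t = np.linspace(0.0, 10.0, 11)
data = np.column_stack((t, np.sin(t)))

with open("measurements.txt", "w") as f:
    f.write("#1\n")
    f.write("double measurements({},{})\n".format(*data.shape))
    np.savetxt(f, data, fmt="%.6g", delimiter="\t")

In the CombiTimeTable you would then set tableOnFile=true, tableName="measurements" and fileName to the path of that file.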
Another Python package for using FMUs is FMPy. I imagine both FMPy and PyFMI have their pros and cons.
The last option, which does not require any external Python package, would be to use mos-files to execute simulations and .txt files to read in the data. If the task you described is the only thing you want to accomplish, mos-scripts are quite sufficient.
My research group has developed Python code for a new building component that we would like to co-simulate with EnergyPlus. For reusability and market impact we would like this connection to be as easy as possible for inexperienced users, and we believe packaging the model using the Functional Mock-up Interface (FMI) standard to be the best option.
We have explored JModelica to test other Functional Mockup Units (FMUs) but found that it does not do FMU export of Python code for model exchange or co-simulation.
I was curious if there are any methods for packaging Python code as an FMU. If there are not, is there another way of linking Python to the FMI standard or connecting it to a building energy software like EnergyPlus?
The alternative to this would be exporting the building energy model as an FMU and importing it into Modelica/JModelica or using BCVTB. Although this would work for us, we worry it would make our tool too difficult for inexperienced individuals to use.
Just to be clear, JModelica does support export and import of both ME and CS FMUs and supports versions 1.0 and 2.0 of the FMI standard. I assume you mean that JModelica does not support export of Python code as FMU. I am not aware of any such solution. If you do not find a way of packaging the Python code into an FMU, perhaps setting up a proxy FMU that communicates with your Python code would work?
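To illustrate the proxy idea: the FMU (written in C or exported from Modelica) would only forward its inputs to the Python process at every communication step and read the outputs back, e.g. over a socket. A very rough sketch of the Python side, with a made-up JSON message format standing in for whatever protocol you choose:

import json
import socketserver

def compute_outputs(time, inputs):
    # placeholder for the actual building-component model
    return {"y": 2.0 * inputs.get("u", 0.0)}

class StepHandler(socketserver.StreamRequestHandler):
    """Answer one co-simulation step per request: read inputs, return outputs."""

    def handle(self):
        request = json.loads(self.rfile.readline())
        # request is assumed to look like {"time": 0.1, "inputs": {"u": 0.5}}
        outputs = compute_outputs(request["time"], request["inputs"])
        self.wfile.write((json.dumps(outputs) + "\n").encode())

if __name__ == "__main__":
    with socketserver.TCPServer(("localhost", 5000), StepHandler) as server:
        server.serve_forever()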