I need to create a Python script for Blender that imports an OBJ file and exports it again with all texture sections removed.
I've never used Blender. Is there a good tutorial that scratches the surface enough to achieve this simple task?
thank you
The Blender Python API website includes several tutorials on using Python in Blender, and you can find the details of the OBJ importer and exporter options there. The OBJ import/export is handled by a Python addon that is included with Blender.
You could start with something as simple as -
import bpy, os
src = '/path/to/use'
for f in (name for name in os.listdir(src) if name.lower().endswith('.obj')):
    path = os.path.join(src, f)  # listdir() returns bare filenames, not full paths
    bpy.ops.import_scene.obj(filepath=path)
    bpy.ops.export_scene.obj(filepath=path, use_materials=False)
If you need more help, Blender has its own SE site that is better for Blender-specific scripting.
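As an aside, since OBJ is a plain-text format, the material references can also be stripped without Blender at all. A minimal sketch, assuming "texture sections" means the mtllib/usemtl lines (the function name and sample data below are just for illustration):

```python
def strip_materials(obj_text):
    """Drop material references (mtllib/usemtl lines) from OBJ text."""
    kept = []
    for line in obj_text.splitlines():
        head = line.split(None, 1)[0] if line.strip() else ""
        if head in ("mtllib", "usemtl"):
            continue  # skip material library and material-use statements
        kept.append(line)
    return "\n".join(kept)

sample = "mtllib scene.mtl\nv 0 0 0\nusemtl red\nf 1 1 1\n"
cleaned = strip_materials(sample)  # geometry lines survive, material lines don't
```

Texture coordinate lines (vt) could be filtered the same way if those need to go too.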
Related
I'm trying to make a Python GUI application that handles some sort of data. I then want the user to be able to manipulate that data using Python scripts, from within the GUI, via a script interpreter with the data exposed as pre-existing objects. Pretty much like VBA is embedded in MS Word, or the way you can embed Python in a C application (see here).
Is there any technique or library to do this?
Has this been achieved in some project before?
One way to do it would be to write a GUI in PyQt and then embed an IPython console inside the GUI as a widget.
Check out this answer:
Embedding IPython Qt console in a PyQt application
and a couple other suggestions here:
https://github.com/ipython/ipython/issues/9508
A different non-PyQt approach is described here:
https://www.pythoncentral.io/embed-interactive-python-interpreter-console/
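If a full IPython console is more than you need, the standard library alone can run user scripts against pre-exposed objects. A minimal sketch, assuming a hypothetical run_user_script helper and app_data object (neither comes from any of the linked projects):

```python
# Hypothetical application data we want to expose to user scripts.
app_data = {"rows": [1, 2, 3]}

def run_user_script(source, data):
    """Execute a user-supplied script with `data` pre-bound as a variable."""
    namespace = {"data": data}
    exec(compile(source, "<user-script>", "exec"), namespace)
    return namespace

# The user script sees the application's data as a ready-made object.
run_user_script("data['rows'].append(sum(data['rows']))", app_data)
```

For an interactive prompt rather than one-shot scripts, the stdlib code.InteractiveConsole can be started with the same kind of namespace dict.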
Does Blender use OpenGl or DirectX? Or is it all done from scratch?
You can look at the Blender source code and see that it's written in both Python and C/C++ -- less Python, more C. OpenGL is referenced frequently in the code, while DirectX only rarely. So there ya go.
Does Blender use OpenGl or DirectX?
All graphics output of Blender is done using OpenGL.
Or does it use a programming language (python?) to do everything from scratch?
Why "or"? An API is not a substitute for a programming language. Blender has been programmed in C, C++ and Python. OpenGL is used to render everything on screen, including the user interface.
Expanding on what datenwolf said: Blender was for the most part written in C, the Game Engine was written in C++, and the entire application has Python bindings (meaning you can use Python within the application). Blender uses OpenGL and has a special engine called GHOST (comprised of OpenGL calls and functionality, mostly legacy but pushing toward modern features such as VBOs) that is used to draw the interface and power its 3D capabilities.
For such questions I have found Ohloh to be useful. It is a site that generates statistics on open source projects; one of the statistics is a list of the different programming languages used in the project. You can look at the statistics for Blender here.
Ohloh is also useful for identifying the tools a project uses and/or for comparing it to similar projects.
CLI applications like Vim and cdargs can open a new screen and then let users draw characters there. I'd like to try making some tools like that with Node, and was told this is called the "alternate screen"(?).
I'm familiar with JS, but if there's no JS solution, Python may(..) be OK. I'd like a code example, please :)
Found a module according to tMC's answer, trying to figure it out...
https://github.com/mscdex/node-ncurses
Vim uses the ncurses library to do its drawing. You have already found Node.js bindings, so you should have no trouble using this library within your application.
I suggest reading the NCURSES tutorial and then attempting the same in Node.js.
I am using QGIS to do some repetitive mapping work. I have a floor plan of an elderly home which is digitized into QGIS with the bed numbers properly labeled, and a spreadsheet with the bed numbers and all the other attributes that belong to that bed.
I need to create multiple layers and visualize them one by one; currently I am using the 'save as image' function. That would be fine if I only needed to do it once, but I have >30 elderly homes, with 4-5 layers to visualize for each home. QGIS is already a lot better than ArcGIS, but I still feel a bit overwhelmed when I realize that I need to do them all manually.
I am looking to Python for automation, but it seems to be used in QGIS mainly for creating plugins.
Being an R user I am used to automating all repetitive tasks.
I know that QGIS is written using Qt4. Does anyone have knowledge of a Qt4 script that I can use as a model to automate QGIS?
Can anyone tell me whether it is possible, and if yes, how?
Thanks.
If you just want to execute a script, have a look at the Python Console (Plugins->Python Console).
Also consider writing a QGIS Python plugin. It's really easy.
Besides these two options, you can also use QGIS as a Python library completely outside of the QGIS application (e.g. in a command-line script) - but I don't know if that's what you're looking for. The excellent PyQGIS Cookbook calls this "Python Applications":
http://www.qgis.org/pyqgis-cookbook/intro.html
It's all open source, so if you look through the extensive QGIS Python plugin repositories, you can simply find a plugin that does something similar to what you have in mind and use its code as a template.
Is it possible to embed a 3-D editor inside my wxPython application? (I'm thinking Blender, but other suggestions are welcome.)
My application opens a wxPython window, and I want to have a 3-D editor inside of it. Of course, I want my program and the 3-D editor to interact with each other.
Possible? How?
Blender has Python plugins; you can write a plugin to interact with your program.
I second Luper Rouch's idea of Blender plugins, but if you must have your own window you will need to fork Blender. Take a look at the MakeHuman project, which used to use Blender as a platform. (I'm not sure, but I think they have a different infrastructure now.)
For Blender specifically, I doubt it. Blender uses a custom UI based on OpenGL, and I'm not sure you can force it to use a pre-existing window. I suggest browsing the code of "Ghost", which is Blender's custom adaptation layer (responsible for interacting with the OS for UI purposes).
Perhaps this script might provide some context for your project. It integrates Blender, ActiveX, and wxPython.
Caveat: Windows only.
For Blender 2.5 on Linux you can use gtk.Socket; a code example is here on pastebin.