Trying to run a Brownie Dapp in Docker - python

I'm trying to run my Brownie dapp in a Docker container, but I always get the same error and can't seem to fix it:
FileNotFoundError: [Errno 2] No such file or directory: 'C:/Users/username/.brownie/packages/OpenZeppelin/openzeppelin-contracts#4.4.2/contracts/access/Ownable.sol'
I cloned the OpenZeppelin folder with all the contracts into the working folder of my project and then remapped it like this:
dependencies:
  - OpenZeppelin/openzeppelin-contracts#4.4.2
compiler:
  solc:
    remappings:
      - "OpenZeppelin/openzeppelin-contracts#4.4.2/ = ./OpenZeppelin/openzeppelin-contracts#4.4.2/"
But it still gives the same error, and when I compile the contract, it is clearly still using the C:/Users/username/.brownie/packages/OpenZeppelin/openzeppelin-contracts#4.4.2/contracts folder. I know this because I tried to compile the contracts with a remapping to a non-existing folder:
remappings:
  - "OpenZeppelin/openzeppelin-contracts#4.4.2/ = ./nothing/"
and it still compiled perfectly.
Just wondering if anybody could help me with this, and thank you in advance!

OK, I just had the identical issue (I cloned a project, tried to run the tests on my local machine, and got the same FileNotFoundError), but I managed to solve it.
First, you need Brownie itself installed (obviously).
Second, you need the OpenZeppelin packages installed inside Brownie.
Here is a link on how to set that up: #OpenZeppelin for Brownie
In short, you just need this command:
brownie pm install OpenZeppelin/openzeppelin-contracts#4.4.2
Make sure it's the OpenZeppelin version you need!
Third and finally, go into your project's build/contracts directory, delete ALL of the smartContract.json files, and then run brownie compile again.
After that you can run the tests, etc.; it should work.
IN-DEPTH EXPLANATION:
The reason we were getting that error is that Brownie uses the smartContract.json build files to run the tests and to locate all of its dependencies/packages. Those .json files contain paths to everything (including the OpenZeppelin package), and they are only regenerated during the compile phase if they don't already exist or if a specific contract's code has been edited. (That is why you need to delete them if you want brownie compile to generate new ones.)
Therefore, if I compile on my local machine and push those smartContract.json files to git, and you clone it all and run the compile/tests, it is highly unlikely to work, as the paths from my machine will differ from yours. (The username alone is practically guaranteed to be different.)
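The cleanup in the final step can be scripted. A minimal sketch, assuming Brownie's default project layout where artifacts live under build/contracts:

```python
import pathlib

def clear_brownie_build(project_root):
    """Delete cached Brownie build artifacts so that `brownie compile`
    regenerates them with dependency paths valid on this machine."""
    build_dir = pathlib.Path(project_root) / "build" / "contracts"
    removed = []
    if build_dir.is_dir():
        for artifact in build_dir.glob("*.json"):
            artifact.unlink()
            removed.append(artifact.name)
    return sorted(removed)
```

Run it from the project root, then rerun brownie compile to rebuild the artifacts.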


Dockerfile not being found for Google cloud run

I've hit another bug. I'm now trying to set up continuous deployment to Google Cloud Run from my GitHub repository, and it's not finding my Dockerfile. I've tried various combinations of file paths, but it still gives me the
Already have image (with digest): gcr.io/cloud-builders/docker
unable to prepare context: unable to evaluate symlinks in Dockerfile path: lstat /workspace/Dockerfile: no such file or directory
error.
This is my file structure in VS Code, and if I run the command from the first Exeplore folder, it finds the Dockerfile.
The source directory is public, so I just can't figure out why it's not finding the Dockerfile.
https://github.com/Pierre-siddall/exeplore
Any advice at all would be greatly appreciated, thank you!
EDIT 1:
For some reason, the file path had different capitalisation on GitHub than in my VS Code; I suspect something on GitHub's end. Now I'm having trouble getting the continuous deployment to actually update the page: it is redeploying, but not updating.
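Cloud Build runs on Linux workers, where paths are case-sensitive, unlike the default Windows and macOS filesystems, so a file committed as dockerfile (or a parent folder with different casing) will not be found at Dockerfile. A small sketch that checks whether every component of a path matches its on-disk casing exactly:

```python
import os, pathlib

def exact_case_exists(path):
    """True only if every component of *path* matches the on-disk
    casing exactly. Case-insensitive filesystems (Windows, default
    macOS) happily resolve the wrong casing; Linux-based CI workers
    such as Cloud Build's do not."""
    p = pathlib.Path(path).absolute()
    current = pathlib.Path(p.anchor)
    for part in p.parts[1:]:
        if part not in os.listdir(current):
            return False
        current = current / part
    return True
```

Running this from the repo root against the path Cloud Build reports (e.g. /workspace/Dockerfile relative to the checkout) would flag a casing mismatch before the build fails.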

Question regarding package management in PyCharm IDE

I have been using PyCharm since I began learning Python because its excellent UI helped me learn a great deal about the language. As I progress to more advanced projects, I am beginning to prefer a text editor / command line combo so that I can build my own venvs and have better access to source control. My question is: how does PyCharm manage custom local packages that I created, so that I can import them anywhere in the directory? For instance, a project built exclusively in PyCharm that runs fine there will raise numerous import errors when I try to run the same project in VS Code, or even a command line shell (yes, I did have the PyCharm-created venv activated on both attempts). For a concrete example, here is the project structure I am confused about:
RootDirectory
    package_1_folder
        __init__.py
        pckg_1_class.py
    program_using_pckg_1_folder
        class_using_pckg1class.py
    venv
The above structure imports and runs without issue in PyCharm; however, VS Code / Sublime used with Command Prompt / Git Bash will raise either an ImportError or a ModuleNotFoundError. I have even gone as far as adding the desired packages to my laptop's Windows PATH, using sys.path.append (I know this is not good practice; I was only trying to get it to work), and even modifying the .pylintrc file with the project path, all with no success. Any help explaining why these errors are happening would be greatly appreciated :)
NOTE:
I have been able to use the packages in VS Code as long as the program importing the module is located at the root directory level, but not in its own folder inside the root directory. Again, this setup WILL work in PyCharm; I just want to know how PyCharm achieves it.
After numerous attempts to find out how the IDE was keeping track of modules, I found that the answer was not visible from the IDE itself: there is a .idea folder in my root directory containing a few .xml documents that manage the directory, including where to read modules from.
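What PyCharm does under the hood: its run configurations have "Add content roots to PYTHONPATH" enabled by default, so the project root is prepended to sys.path before your script starts. Plain python from a shell only adds the script's own directory, which is why the sibling-package imports fail there. A self-contained demonstration (the throwaway package mirrors the structure above):

```python
import importlib, pathlib, sys, tempfile

# Build a throwaway project that mirrors the layout above.
root = pathlib.Path(tempfile.mkdtemp())
pkg = root / "package_1_folder"
pkg.mkdir()
(pkg / "__init__.py").write_text("")
(pkg / "pckg_1_class.py").write_text("VALUE = 42\n")

# This is the step PyCharm performs for you: put the project root
# (the "content root") on sys.path before anything is imported.
sys.path.insert(0, str(root))

mod = importlib.import_module("package_1_folder.pckg_1_class")
print(mod.VALUE)  # → 42
```

Outside PyCharm you get the same behaviour by setting the PYTHONPATH environment variable to the project root, or more robustly by giving the project a minimal pyproject.toml and installing it into the venv with pip install -e .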

VCRedist dependencies with PyInstaller

I'm trying to release a simple app with PyInstaller that works on Windows 7 x64 without any dependencies, but I ran into a problem with the Microsoft VCRedist. On my host PC, I installed VCRedist 2015 and generated the executable normally (not standalone). The VCRedist DLL files (api-ms-win*.dll) were included in the generated directory as expected, and the app worked fine on the target machine without VCRedist. Then I tried to generate a standalone app, but this time when I executed it on the target machine I got this error:
The procedure entry point ucrtbase_putch could not be located in the dynamic link library api-ms-win-crt-conio-l1-1-0.dll.
I checked the generated temp folder (_MEI*) and found that the right DLLs are right there, yet somehow the executable cannot use them. I made a copy of the _MEI* folder, put the standalone executable next to it, and surprisingly it worked. It seems that some of those DLLs also exist in the Windows directory of the target machine, and the executable is loading those instead of the ones in the _MEI* directory.
I also read the docs, but they didn't help much with this.
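One thing worth trying (a sketch, not a verified fix): in one-file mode, stale Universal CRT forwarder DLLs already present in the target's system directories can win over the copies extracted to _MEI*. You can explicitly bundle known-good CRT DLLs from the build machine in the .spec file; the Windows Kits path below is the usual location of the UCRT redistributable, but treat it as an assumption for your setup:

```python
# myapp.spec -- PyInstaller spec sketch (untested assumption):
# force the build machine's Universal CRT DLLs into the bundle so the
# extracted copies, not the target's System32, are the ones loaded.
import glob, os

crt_dlls = [(os.path.basename(dll), dll, 'BINARY')
            for dll in glob.glob(r'C:\Program Files (x86)\Windows Kits\10'
                                 r'\Redist\ucrt\DLLs\x64\*.dll')]

a = Analysis(['main.py'])
pyz = PYZ(a.pure)
exe = EXE(pyz, a.scripts, a.binaries + crt_dlls, a.datas,
          name='myapp', console=True)
```

If that doesn't help, installing KB2999226 (the Universal CRT update) on the Windows 7 target is the usual fallback.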

Bundle pdflatex to run on an AWS Lambda with a custom AMI image

My goal is to create an Amazon Lambda Function to compile .tex files into .pdf using the pdflatex tool through python.
I've built an EC2 instance using Amazon's AMI and installed pdflatex using yum:
yum install texlive-collection-latex.noarch
This way, I can use the pdflatex and my python code works, compiling my .tex into a .pdf the way I want.
Now, I need to create a .zip file bundle containing the pdflatex tool; latexcodec (a python library I've used, no problem with this one); and my python files: handler (lambda function handler) and worker (which compiles my .tex file).
This bundle is the deployment package needed to upload my code and libraries to Amazon Lambda.
The problem is: pdflatex has a lot of dependencies, and I'd have to gather everything in one place. I've found a script which does that for me:
http://www.metashock.de/2012/11/export-binary-with-lib-dependencies/
I set my PATH to find the pdflatex binary in the new directory, but hit an issue: pdflatex couldn't find some of its dependencies. I was able to fix that by pointing an environment variable at the folder the script moved everything to:
export LD_LIBRARY_PATH="$LD_LIBRARY_PATH:/home/ec2-user/lambda/local/lib64:/home/ec2-user/lambda/local/usr/lib64"
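The linked script essentially parses ldd output and copies each resolved shared library into one local directory. A minimal Python equivalent, assuming a Linux host with ldd available:

```python
import pathlib, re, shutil, subprocess

def collect_shared_libs(binary, dest):
    """Copy every shared library that `ldd` resolves for *binary*
    into *dest*, so the whole set can be shipped as one bundle and
    located at runtime via LD_LIBRARY_PATH."""
    dest = pathlib.Path(dest)
    dest.mkdir(parents=True, exist_ok=True)
    out = subprocess.run(["ldd", binary], capture_output=True,
                         text=True, check=True).stdout
    copied = []
    # Lines look like: "libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x...)"
    for match in re.finditer(r"=>\s*(/\S+)\s*\(0x", out):
        lib = match.group(1)
        shutil.copy2(lib, dest / pathlib.Path(lib).name)
        copied.append(pathlib.Path(lib).name)
    return copied
```

After copying, point LD_LIBRARY_PATH at dest exactly as in the export line above.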
At this point, I was running pdflatex directly, through bash. But my python script was firing an error when trying to use the pdflatex:
mktexfmt: No such file or directory
I can't find the format file `pdflatex.fmt'!
I was also able to solve this by moving the pdflatex.fmt and texmf.cnf files to my bundle folder and setting some environment variables as well:
export TEXFORMATS=/home/ec2-user/lambda/local/usr/bin
And now, my current problem, the python script keeps throwing the following error:
---! /home/ec2-user/lambda/local/usr/bin/pdflatex.fmt doesn't match pdftex.pool
(Fatal format file error; I'm stymied)
I've found some possible solutions: deleting a .texmf-var folder (which in my case does not exist) and using fmtutil (which I don't have in my AMI image)...
1 - Am I missing an environment variable?
2 - Am I moving my pdflatex binary and all its dependencies the wrong way?
3 - Is there a correct way to move a binary and all its dependencies so they can be used on another machine (considering the env variables)?
The Lambda environment is a container, not a regular EC2 instance. All files in your .zip are deployed to /var/task/ inside the container. Also, everything is mounted read-only except the /tmp directory, so it's impossible to run yum, for example.
For your case, I'd recommend putting the binaries in your zip and invoking them as /var/task/<binary name>. Remember to use a statically compiled binary built on a Linux compatible with the container's kernel.
samoconnor is doing pretty much exactly what you want in https://github.com/samoconnor/lambdalatex. Note that he sets environment variables in his handler function:
os.environ['PATH'] += ":/var/task/texlive/2017/bin/x86_64-linux/"
os.environ['HOME'] = "/tmp/latex/"
os.environ['PERL5LIB'] = "/var/task/texlive/2017/tlpkg/TeXLive/"
That might do the trick for you as well.
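Putting the answer's environment tweaks together, a handler helper might look like the sketch below. The texlive/2017 paths follow samoconnor's layout and are assumptions for your own bundle:

```python
import os, subprocess

def lambda_tex_env(base=None, workdir="/tmp/latex"):
    """Build the environment the bundled TeX tree expects inside the
    Lambda container: binaries on PATH, a writable HOME under /tmp,
    and TeXLive's Perl modules on PERL5LIB."""
    env = dict(os.environ if base is None else base)
    env["PATH"] = env.get("PATH", "") + ":/var/task/texlive/2017/bin/x86_64-linux/"
    env["HOME"] = workdir
    env["PERL5LIB"] = "/var/task/texlive/2017/tlpkg/TeXLive/"
    return env

def compile_tex(tex_path, workdir="/tmp/latex"):
    """Run pdflatex on *tex_path*, writing output under /tmp, the
    only writable directory in the Lambda container."""
    os.makedirs(workdir, exist_ok=True)
    return subprocess.run(
        ["pdflatex", "-interaction=nonstopmode",
         "-output-directory", workdir, tex_path],
        env=lambda_tex_env(workdir=workdir),
        capture_output=True, text=True)
```

The resulting PDF then lands in workdir, from where the handler can upload it to S3 or return it.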

svn: E155007: '/home/k/python/myproject_env/django-myproject/myproject' is not a working copy

I have been building a Django project and I want to use Subversion for version control on Ubuntu.
As you know, I need to keep most of the project in the repository; however, some files and directories should only stay locally and not be tracked.
So I went to my project directory and typed the following command.
svn propedit svn:ignore myproject
But this exception occurred:
svn: E155007: '/home/k/python/myproject_env/django-myproject/myproject' is not a working copy
The path is correct: my app really is located at /home/k/python/myproject_env/django-myproject/myproject.
The guide says that typing the command above will open a temporary file in the editor, where I need to put the file and directory patterns for Subversion to ignore. But it doesn't work.
I have already searched for a solution, but there is no answer anywhere.
I would like to know the solution. Thanks for your time.
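E155007 means the target directory is not part of a checked-out working copy: svn looks for a .svn control directory in the path or one of its ancestors and found none. You can reproduce that check with a few lines; if this returns None, the fix is to run svn checkout first (or svn add the directory from inside an existing working copy) before svn propedit:

```python
import pathlib

def find_working_copy_root(path):
    """Walk upward from *path* looking for the `.svn` control
    directory; return the working-copy root, or None if *path*
    was never checked out (svn: E155007 territory)."""
    p = pathlib.Path(path).resolve()
    for candidate in (p, *p.parents):
        if (candidate / ".svn").is_dir():
            return candidate
    return None
```

Note that svn propedit svn:ignore myproject must also be run from the parent directory of myproject, inside the working copy.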
