openchange provision failure - python

I am currently looking at OpenChange because I find it fascinating that there is actually something out there that can effectively work as an Exchange server. I followed the directions verbatim; however, I keep running into the same problem:
When I get to the part where I need to provision openchange detailed here:
http://www.openchange.org/cookbook/configuring.html
I am directed to type in the following command:
./setup/openchange_provision --standalone
I keep getting the following error:
Error: "(53, 'schema_data_add: updates are not allowed: reject request\n')" when adding element:
dn: CN=ms-Exch-Access-Control-Map,CN=Schema,CN=Configuration,DC=domain,DC=local
objectClass: top
objectClass: attributeSchema
cn: ms-Exch-Access-Control-Map
distinguishedName: CN=ms-Exch-Access-Control-Map,CN=Schema,CN=Configuration,DC=domain,DC=local
attributeID: 1.2.840.113556.1.4.7000.102.64
attributeSyntax: 2.5.5.12
isSingleValued: TRUE
showInAdvancedViewOnly: TRUE
adminDisplayName: ms-Exch-Access-Control-Map
adminDescription: ms-Exch-Access-Control-Map
oMSyntax: 64
searchFlags: 0
lDAPDisplayName: msExchAccessControlMap
name: ms-Exch-Access-Control-Map
#schemaIDGUID: 8ff54464-b093-11d2-aa06-00c04f8eedd8
isMemberOfPartialAttributeSet: FALSE
objectCategory: CN=Attribute-Schema,CN=Schema,CN=Configuration,DC=domain,DC=local
[!] error while provisioning the Exchange schema classes (53): schema_data_add: updates are not allowed: reject request
Traceback (most recent call last):
File "./setup/openchange_provision", line 90, in <module>
openchange.provision(setup_path, provisionnames, lp, creds)
File "python/openchange/provision.py", line 742, in provision
install_schemas(setup_path, names, lp, creds, reporter)
File "python/openchange/provision.py", line 441, in install_schemas
provision_schema(sam_db, setup_path, names, reporter, schema['path'], schema['description'], schema['modify_mode'])
File "python/openchange/provision.py", line 227, in provision_schema
sam_db.add_ldif(el, ['relax:0'])
File "/usr/local/samba/lib/python2.7/site-packages/samba/__init__.py", line 224, in add_ldif
self.add(msg, controls)
_ldb.LdbError: (53, 'schema_data_add: updates are not allowed: reject request\n')
I am at a complete loss as to what might be wrong; I have rebuilt this many times and keep hitting the same roadblock. Any help with this would be greatly appreciated.

If you look at the output of OpenChange, you can find the root cause of the problem:
[!] error while provisioning the Exchange schema classes (53): schema_data_add: updates are not allowed: reject request
Add the following line to the [global] section of your smb.conf to allow schema changes:
dsdb:schema update allowed=true
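In context, the [global] section would look something like this (surrounding settings omitted); restart the samba service afterwards so the change takes effect:

```
[global]
    # permit provisioning tools to modify the AD schema
    dsdb:schema update allowed = true
```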


Why is UPnPy discover throwing AttributeError?

I am working with UPnPy, and I immediately notice an issue when attempting to discover devices on my local network. Here is the basic code I am using:
import upnpy
upnp = upnpy.UPnP()
devices = upnp.discover()
This throws the following exception:
Traceback (most recent call last):
File "C:\Users\name\Projects\pythonProject\main.py", line 5, in <module>
devices = upnp.discover()
File "C:\Users\name\Projects\pythonProject\venv\lib\site-packages\upnpy\upnp\UPnP.py", line 33, in discover
for device in self.ssdp.m_search(discover_delay=delay, st='upnp:rootdevice', **headers):
File "C:\Users\name\Projects\pythonProject\venv\lib\site-packages\upnpy\ssdp\SSDPRequest.py", line 50, in m_search
devices = self._send_request(self._get_raw_request())
File "C:\Users\name\Projects\pythonProject\venv\lib\site-packages\upnpy\ssdp\SSDPRequest.py", line 100, in _send_request
device = SSDPDevice(addr, response.decode())
File "C:\Users\name\Projects\pythonProject\venv\lib\site-packages\upnpy\ssdp\SSDPDevice.py", line 87, in __init__
self._get_services_request()
File "C:\Users\name\Projects\pythonProject\venv\lib\site-packages\upnpy\ssdp\SSDPDevice.py", line 23, in wrapper
return func(device, *args, **kwargs)
File "C:\Users\name\Projects\pythonProject\venv\lib\site-packages\upnpy\ssdp\SSDPDevice.py", line 54, in wrapper
return func(instance, *args, **kwargs)
File "C:\Users\name\Projects\pythonProject\venv\lib\site-packages\upnpy\ssdp\SSDPDevice.py", line 171, in _get_services_request
event_sub_url = service.getElementsByTagName('eventSubURL')[0].firstChild.nodeValue
AttributeError: 'NoneType' object has no attribute 'nodeValue'
I have been researching the cause of this but have found nothing. I am using UPnPy version 1.1.8 and PyCharm as my IDE. I've tried previous versions of UPnPy, but none seem to work. Any help would be appreciated. Thanks!
Most likely you have a non-compliant UPnP device on your home network that is serving non-standard/broken XML at its description location, and UPnPy was not smart enough to handle the parsing error and ignore that device.
That scenario is more common than you might think: many Smart TVs (LG ones for sure) have an embedded device that advertises itself via UPnP but whose description endpoint answers with JSON instead of XML!
Some suggestions:
Use a different library or app (you could try my own), at least to identify the culprit. Turn on verbosity and check for warnings and parsing-error logs.
Use a sniffer such as tcpdump to capture network packets (UDP port 1900), look for SSDP NOTIFY advertisements, and manually open each LOCATION URL in a browser to see whether it returns valid XML.
Selectively turn off / unplug devices you think might be UPnP-enabled, such as Smart TVs, home theaters, video game consoles, routers, etc., to see which one is serving the bogus XML.
Edit your local copy of UPnPy to handle the error; for example, enclose the failing function/line in a try/except block and print some details about what it is trying to parse before the error.
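The last suggestion can be sketched with just the stdlib parser UPnPy itself uses (xml.dom.minidom). The service XML below is a made-up, non-compliant example in which <eventSubURL> is empty, which is exactly the shape that makes firstChild None and triggers the AttributeError from the traceback:

```python
import xml.dom.minidom

# Hypothetical, non-compliant service description: <eventSubURL> is empty,
# so its firstChild is None.
broken_xml = """<service>
  <serviceType>urn:schemas-upnp-org:service:Dummy:1</serviceType>
  <eventSubURL></eventSubURL>
</service>"""

service = xml.dom.minidom.parseString(broken_xml)
node = service.getElementsByTagName('eventSubURL')[0].firstChild

# Defensive version of the failing line in SSDPDevice._get_services_request:
try:
    event_sub_url = node.nodeValue
except AttributeError:
    # Skip (or log) the bogus device instead of crashing discover().
    print('skipping device: malformed eventSubURL in', repr(broken_xml[:30]))
    event_sub_url = None

print(event_sub_url)  # None
```

Applying the same try/except around the loop body in your local copy of SSDPDevice.py lets discovery continue past the broken device while telling you which one it was.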

Extract all confirmed transactions from a bitcoin block using python

I want to extract all confirmed transactions from the bitcoin blockchain. I know there are repos out there (e.g. https://github.com/znort987/blockparser) but I want to write something myself for my better understanding.
I have tried the following code after downloading far more than 42 blocks, while running bitcoind (minimal example):
from bitcoin.rpc import RawProxy

proxy = RawProxy()
blockheight = 42
block = proxy.getblock(proxy.getblockhash(blockheight))
tx_list = block['tx']
for tx_id in tx_list:
    raw_tx = proxy.getrawtransaction(tx_id)
This yields the following error:
Traceback (most recent call last):
File "<stdin>", line 2, in <module>
File "/home/donkeykong/.local/lib/python3.7/site-packages/bitcoin/rpc.py", line 315, in <lambda>
f = lambda *args: self._call(name, *args)
File "/home/donkeykong/.local/lib/python3.7/site-packages/bitcoin/rpc.py", line 239, in _call
'message': err.get('message', 'error message not specified')})
bitcoin.rpc.InvalidAddressOrKeyError: {'code': -5, 'message': 'No such mempool transaction. Use -txindex or provide a block hash to enable blockchain transaction queries. Use gettransaction for wallet transactions.'}
Could anyone elucidate what I am misunderstanding?
For reproduction:
python version: Python 3.7.3
I installed the bitcoin.rpc via: pip3 install python-bitcoinlib
bitcoin-cli version: Bitcoin Core RPC client version v0.20.1
Running the client as bitcoind -txindex solves the problem, as it maintains the full transaction index. I should have paid more attention to the error message...
Excerpt from bitcoind --help:
-txindex
Maintain a full transaction index, used by the getrawtransaction rpc
call (default: 0)
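To make the setting permanent, the same switch can go into bitcoin.conf (default datadir path assumed); on v0.17 and later, which includes the v0.20.1 used here, Bitcoin Core builds the missing index in the background on the next start rather than requiring a full reindex:

```
# ~/.bitcoin/bitcoin.conf -- equivalent to passing -txindex on every start
txindex=1
```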

Using pysolr getting 400 error when trying to add data to solr

Using pysolr with Python 3.9, I am trying to add a document to Solr and getting the 400 error below, even with only one or two fields.
The fields that I'm using here are dynamic fields.
No issue with connecting to solr.
#!/usr/bin/python
import pysolr

solr = pysolr.Solr('http://localhost:8080/solr/', always_commit=True)
if solr.ping():
    print('connection successful')

docs = [{'id': '123c', 's_chan_name': 'TV-201'}]
solr.add(docs)
res = solr.search('123c')
print(res)
Getting below error:
connection successful
Traceback (most recent call last):
File "/tmp/test-solr.py", line 8, in <module>
solr.add(docs)
File "/usr/local/lib/python3.9/site-packages/pysolr.py", line 1042, in add
return self._update(
File "/usr/local/lib/python3.9/site-packages/pysolr.py", line 568, in _update
return self._send_request(
File "/usr/local/lib/python3.9/site-packages/pysolr.py", line 463, in _send_request
raise SolrError(error_message % (resp.status_code, solr_message))
pysolr.SolrError: Solr responded with an error (HTTP 400): [Reason: None]
<html><head><meta http-equiv="Content-Type" content="text/html;charset=utf-8"/>
<title>Error 400 Unexpected character &apos;[&apos; (code 91) in prolog;
expected &apos;<&apos; at [row,col {unknown-source}]: [1,1]</title>
</head><body><h2>HTTP ERROR 400</h2><p>Problem accessing /solr/update/.
Reason:<pre> Unexpected character &apos;[&apos; (code 91) in prolog;
expected &apos;<&apos; at [row,col {unknown-source}]: [1,1]</pre>
</p><hr><a href="http://eclipse.org/jetty">
Powered by Jetty://9.4.15.v20190215</a><hr/></body></html>
I’d recommend looking at the debug logs (which might require you to use logging.basicConfig() to enable them) and testing the URLs it’s using yourself. Depending on your Solr configuration, it might be the case that your Solr URL needs to have the core at the end (e.g. http://localhost:8983/solr/mycore).
In general, most reports like this which we get turn out to be oddities in how someone configured Solr. In addition to the debug logging, I highly recommend looking through the test suite since that does everything needed to run Solr and can be a handy point for comparison:
https://github.com/django-haystack/pysolr
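The debug-logging suggestion needs nothing beyond the standard library, since pysolr logs through the logging module. A minimal sketch:

```python
import logging

# Root logger at DEBUG shows every request pysolr makes (URL, body, response).
logging.basicConfig(level=logging.DEBUG)

# Or scope it to pysolr alone if the global firehose is too noisy:
logging.getLogger('pysolr').setLevel(logging.DEBUG)
```

With this in place before constructing pysolr.Solr, the exact URL being POSTed to appears in the output, which makes a missing core name in the Solr URL easy to spot.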
The solr.add method supports only a single document:
docs = {'id': '123c', 's_chan_name': 'TV-201'}
solr.add(docs)
It is not suitable for an iterable. For a list of documents, use:
docs = [{'id': '123c', 's_chan_name': 'TV-201'}]
solr.add_many(docs)
and it will work.

python - openchange provision failure

Ubuntu 12.04.5
When I get to the part where I need to provision openchange detailed here: http://www.openchange.org/cookbook/configuring.html
I am directed to type in the following command:
:~/openchange# sudo PYTHONPATH=$PYTHONPATH ./setup/openchange_provision --standalone
I keep getting the following error:
NOTE: This operation can take several minutes
[+] Step 1: Register Exchange OIDs
[+] Step 2: Add Exchange attributes to Samba schema
schema_data_add: updates are not allowed: reject request
Error: "(53, 'schema_data_add: updates are not allowed: reject request\n')" when adding element:
ms-Exch-Access-Control-Map
Contains the mapping for the access controls.
dn: CN=ms-Exch-Access-Control-Map,CN=Schema,CN=Configuration,DC=stremblay,DC=local
objectClass: top
objectClass: attributeSchema
cn: ms-Exch-Access-Control-Map
distinguishedName: CN=ms-Exch-Access-Control-Map,CN=Schema,CN=Configuration,DC=stremblay,DC=local
attributeID: 1.2.840.113556.1.4.7000.102.64
attributeSyntax: 2.5.5.12
isSingleValued: TRUE
showInAdvancedViewOnly: TRUE
adminDisplayName: ms-Exch-Access-Control-Map
adminDescription: ms-Exch-Access-Control-Map
oMSyntax: 64
searchFlags: 0
lDAPDisplayName: msExchAccessControlMap
name: ms-Exch-Access-Control-Map
schemaIDGUID: 8ff54464-b093-11d2-aa06-00c04f8eedd8
isMemberOfPartialAttributeSet: FALSE
objectCategory: CN=Attribute-Schema,CN=Schema,CN=Configuration,DC=stremblay,DC=local
[!] error while provisioning the Exchange schema classes (53): schema_data_add: updates are not allowed: reject request
Traceback (most recent call last):
File "./setup/openchange_provision", line 109, in <module>
openchange.provision(setup_path, provisionnames, lp, creds)
File "/usr/local/samba/lib/python2.7/site-packages/openchange/provision.py", line 777, in provision
install_schemas(setup_path, names, lp, creds, reporter)
File "/usr/local/samba/lib/python2.7/site-packages/openchange/provision.py", line 447, in install_schemas
provision_schema(sam_db, setup_path, names, reporter, schema['path'], schema['description'], schema['modify_mode'])
File "/usr/local/samba/lib/python2.7/site-packages/openchange/provision.py", line 233, in provision_schema
sam_db.add_ldif(el, ['relax:0'])
File "/usr/local/samba/lib/python2.7/site-packages/samba/__init__.py", line 224, in add_ldif
self.add(msg, controls)
_ldb.LdbError: (53, 'schema_data_add: updates are not allowed: reject request\n')
root@mail:~/openchange# error while provisioning the Exchange schema classes (53): schema_data_add: updates are not allowed: reject request
Any help solving this error would be appreciated.
Add this on your smb.conf
sudo nano /etc/samba/smb.conf
[global]
dsdb:schema update allowed=true

How to fix IncompleteRead error on Linux using Py2Neo

I am updating data on a Neo4j server using Python (2.7.6) and Py2Neo (1.6.4). My load function is:
from py2neo import neo4j, node, rel, cypher

session = cypher.Session('http://my_neo4j_server.com.mine:7474')

def load_data():
    tx = session.create_transaction()
    for row in dataframe.iterrows():  # dataframe is a pandas DataFrame
        name = row[1].name
        id = row[1].id
        merge_query = "MERGE (a:label {name:'%s', name_var:'%s'}) " % (id, name)
        tx.append(merge_query)
    tx.commit()
When I execute this from Spyder on Windows it works great: all the data from the dataframe is committed to Neo4j and visible in the graph. However, when I run it from a Linux server (different from the Neo4j server) I get the following error at tx.commit(). Note that I have the same versions of Python and Py2Neo.
INFO:py2neo.packages.httpstream.http:>>> POST http://neo4j1.qs:7474/db/data/transaction/commit [1360120]
INFO:py2neo.packages.httpstream.http:<<< 200 OK [chunked]
ERROR:__main__:some part of process failed
Traceback (most recent call last):
File "my_file.py", line 132, in load_data
tx.commit()
File "/usr/local/lib/python2.7/site-packages/py2neo/cypher.py", line 242, in commit
return self._post(self._commit or self._begin_commit)
File "/usr/local/lib/python2.7/site-packages/py2neo/cypher.py", line 208, in _post
j = rs.json
File "/usr/local/lib/python2.7/site-packages/py2neo/packages/httpstream/http.py", line 563, in json
return json.loads(self.read().decode(self.encoding))
File "/usr/local/lib/python2.7/site-packages/py2neo/packages/httpstream/http.py", line 634, in read
data = self._response.read()
File "/usr/local/lib/python2.7/httplib.py", line 543, in read
return self._read_chunked(amt)
File "/usr/local/lib/python2.7/httplib.py", line 597, in _read_chunked
raise IncompleteRead(''.join(value))
IncompleteRead: IncompleteRead(128135 bytes read)
This post (IncompleteRead using httplib) suggests that is an httplib error. I am not sure how to handle since I am not calling httplib directly.
Any suggestions for getting this load to work on Linux or what the IncompleteRead error message means?
UPDATE :
The IncompleteRead error is being caused by a Neo4j error being returned. The line returned in _read_chunked that is causing the error is:
pe}"}]}],"errors":[{"code":"Neo.TransientError.Network.UnknownFailure"
Neo4j docs say this is an unknown network error.
Although I can't say for sure, this implies some kind of local network issue between client and server rather than a bug within the library. Py2neo wraps httplib (which is pretty solid itself) and, from the stack trace, it looks as though the client is expecting more chunks from a chunked response.
To diagnose further, you could make some curl calls from your Linux application server to your database server and see what succeeds and what doesn't. If that works, try writing a quick and dirty python script to make the same calls with httplib directly.
UPDATE 1: Given the update above and the fact that the server streams its responses, I'm thinking that the chunk size might represent the intended payload but the error cuts the response short. Recreating the issue with curl certainly seems like the best next step to help determine whether it is a fault in the driver, the server or something else.
UPDATE 2: Looking again this morning, I notice that you're using Python substitution for the properties within the MERGE statement. As good practice, you should use parameter substitution at the Cypher level:
merge_query = "MERGE (a:label {name:{name}, name_var:{name_var}})"
merge_params = {"name": id, "name_var": name}
tx.append(merge_query, merge_params)
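The difference is visible without a server: %-substitution splices values (quotes and all) into the query text, while parameters travel separately. A small sketch using a hypothetical name containing a quote:

```python
# Hypothetical values; a single quote in the data breaks %-substitution.
name = "O'Brien"
id_ = "42"

# String substitution: the quote in O'Brien lands inside the Cypher string
# literal, corrupting the query (and opening the door to injection).
query_bad = "MERGE (a:label {name:'%s', name_var:'%s'}) " % (id_, name)
print(query_bad)

# Parameter substitution: the query text stays fixed; values are sent alongside.
query_good = "MERGE (a:label {name:{name}, name_var:{name_var}})"
params = {"name": id_, "name_var": name}
# tx.append(query_good, params)   # as in the answer above; needs a live session
```

Beyond safety, parameterized queries also let the server cache one query plan for every row of the dataframe instead of re-parsing a new string each time.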
