I have Python code that generates a PBKDF2-SHA1 hash of a password using the hashlib.pbkdf2_hmac method. Then I use that password digest in a .NET Framework 4.5 program to verify it against the same password. The C# program returns false, which suggests that the hash produced by the Python program is incorrect.
The key is in this format: #iterations|salt|key.
Then I take that key and try to verify it in the .NET Framework app using the following method:
public static bool IsValid(string testPassword, string originalDelimitedHash)
{
    // extract original values from delimited hash text
    var originalHashParts = originalDelimitedHash.Split('|');
    var origIterations = Int32.Parse(originalHashParts[0]);
    var origSalt = Convert.FromBase64String(originalHashParts[1]);
    var originalHash = originalHashParts[2];

    // generate hash from test password and original salt and iterations
    var pbkdf2 = new Rfc2898DeriveBytes(testPassword, origSalt, origIterations, HashAlgorithmName.SHA1);
    byte[] testHash = pbkdf2.GetBytes(20);
    var hashStr = Convert.ToBase64String(testHash);

    return hashStr == originalHash;
}
My Python program:
import binascii
from hashlib import pbkdf2_hmac
from base64 import b64encode
from os import urandom

def generate_password_hash(password: str):
    encodedPass = password.encode('utf8')
    random_bytes = urandom(20)
    salt = b64encode(random_bytes)
    iterations = 5000
    key = pbkdf2_hmac('sha1', encodedPass, salt, iterations, dklen=20)
    result = f'{iterations}|{salt.decode("utf-8")}|{binascii.hexlify(key).decode("utf-8")}'
    return result
So if my password is hDHzJnMg0O, the resulting digest from the above Python method would be something like 5000|J5avBy0q5p9R/6cgxUpu6+6sW7o=|2445594504c9ffb54d1f11bbd0b385e3e37a5aca
If I take that and supply it to my C# IsValid method (see above), it returns false, which means the passwords do not match:
static void Main(string[] args)
{
    var pass = "hDHzJnMg0O";
    var hash = "5000|J5avBy0q5p9R/6cgxUpu6+6sW7o=|2445594504c9ffb54d1f11bbd0b385e3e37a5aca";
    var isValid = IsValid(pass, hash); // returns false
}
The Python code:
uses b64encode(random_bytes) as the salt for the PBKDF2 call. This is rather unusual (but not a bug): typically the raw data, i.e. random_bytes, is applied as the salt and passed to the PBKDF2 call, and the Base64 encoding is only used to create the string representation for storage.
hex encodes the key (i.e. the return value of the PBKDF2 call).
The C# code differs on both points; it:
uses the raw data (i.e. random_bytes from the Python side) for the PBKDF2 call, i.e. the salt from the Python side is Base64 decoded;
Base64 encodes the key (i.e. the return value of the PBKDF2 call).
Changes in the C# code for compatibility with the Python code (of course the changes could also be made in the Python code, but the Python code seems to be the reference):
...
var origSalt = Encoding.UTF8.GetBytes(originalHashParts[1]); // Convert.FromBase64String(originalHashParts[1]);
...
var hashStr = Convert.ToHexString(testHash); // Convert.ToBase64String(testHash);
...
For the latter, Convert.ToHexString() is used, which is available since .NET 5. For other .NET versions see e.g. here.
Furthermore, since hex encoded values are compared and implementations are not standardized with regard to lower case (e.g. binascii.hexlify(key)) versus upper case (e.g. Convert.ToHexString(testHash)) output, it is more robust to convert both strings to a uniform case, e.g.:
if (hashStr.ToUpper() == originalHash.ToUpper())
return true;
With these changes, validation with the C# code works.
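For reference, a quick cross-check of the adjusted scheme on the Python side (a minimal sketch mirroring the question's generate function; is_valid is a hypothetical helper, not part of the original code):

import binascii
from hashlib import pbkdf2_hmac

def is_valid(test_password: str, delimited_hash: str) -> bool:
    iterations, salt, orig_hex = delimited_hash.split('|')
    key = pbkdf2_hmac('sha1', test_password.encode('utf8'),
                      salt.encode('utf8'),  # the Base64 *string* itself is the salt here
                      int(iterations), dklen=20)
    return binascii.hexlify(key).decode('utf-8') == orig_hex.lower()

It should return True for any digest produced by the question's generate_password_hash.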
Edit (with regard to the change in the Python code addressed in the comment):
If in the Python code random_bytes is used as salt and the salt is Base64 encoded for concatenation, then in the C# code the Base64 encoded salt must be Base64 decoded again (as in the original C# code).
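For completeness, a minimal sketch of that variant on the Python side (raw random_bytes as the PBKDF2 salt, Base64 only for the stored string); with this, the salt handling of the original C# code can stay, while the hex change for the key comparison is still required:

import binascii
from base64 import b64encode
from hashlib import pbkdf2_hmac
from os import urandom

def generate_password_hash(password: str) -> str:
    salt = urandom(20)  # the raw bytes are used as the PBKDF2 salt
    iterations = 5000
    key = pbkdf2_hmac('sha1', password.encode('utf8'), salt, iterations, dklen=20)
    # Base64 is applied only to build the stored representation of the salt
    return f'{iterations}|{b64encode(salt).decode("utf-8")}|{binascii.hexlify(key).decode("utf-8")}'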
Related
I am rewriting code written in C# in Python. In one part of the C# program, a string is signed with the RSA algorithm and sent to an API.
I have rewritten all parts of the C# program in Python, including the string signature section.
The signature output of the C# program is the same as the signature output of the Python program.
But the API in question does not accept the signature generated by Python, even though the signature strings from C# and Python are exactly the same.
I would be grateful if someone could help me.
Also: is the output of the RSA signature algorithm the same for the same string each time, or does it change with each signing?
In Python, I use the following:
from Cryptodome.Signature import pkcs1_15
from Cryptodome.Hash import SHA256
from Cryptodome.PublicKey import RSA
import base64
...
I rewrote the signing of a string from the C# program in Python (RSA algorithm, 2048-bit key), and both programs (C# and Python) produce the same output for the same string.
When I send this signature string to an API, the signature generated with C# is accepted without error, but the signature generated in Python is rejected with an error.
What I expected was that since the signature output from both languages is the same (identical strings), the API should not have any problem with it.
Is my view wrong?
C#
public static string SignData(String stringToBeSigned, string privateKey)
{
    var pem = "-----BEGIN PRIVATE KEY-----\n" + privateKey + "\n-----END PRIVATE KEY-----";
    PemReader pr = new PemReader(new StringReader(pem));
    AsymmetricKeyParameter privateKeyParams = (AsymmetricKeyParameter)pr.ReadObject();
    RSAParameters rsaParams = DotNetUtilities.ToRSAParameters((RsaPrivateCrtKeyParameters)privateKeyParams);
    RSACryptoServiceProvider csp = new RSACryptoServiceProvider();
    csp.ImportParameters(rsaParams);
    var dataBytes = Encoding.UTF8.GetBytes(stringToBeSigned);
    return Convert.ToBase64String(csp.SignData(dataBytes, HashAlgorithmName.SHA256, RSASignaturePadding.Pkcs1));
}
Python:
key = RSA.import_key(open('rsa.private').read())
h = SHA256.new(normalize.encode('utf-8'))  # normalize holds the string to be signed (defined elsewhere)
signature = pkcs1_15.new(key).sign(h)
base64_bytes = base64.b64encode(signature)
base64_signature = base64_bytes.decode('ascii')
print(base64_signature)
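Regarding the side question above: RSASSA-PKCS#1 v1.5, which both snippets use, is deterministic, so the same key and data always yield the same signature (PSS, by contrast, is randomized). A quick check, reusing key and h from the snippet above:

sig1 = pkcs1_15.new(key).sign(h)
sig2 = pkcs1_15.new(key).sign(h)
assert sig1 == sig2  # PKCS#1 v1.5 signing is deterministic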
I am working on converting an AES/GCM-256 encryption function from C# to Python.
I found the following Python code and am using it; the input data in my Python function is the same as the C# data.
However, the output string (encoded) produced by my Python function is not the same as the output of the C# function, although my inputs (key, iv and data) are the same in both functions.
I would be grateful if someone could help me.
My key is: b'4fda3c622e966e0839441401bbd3b8f191d4267bf5f19b40812a34b212fd3ed9'
My iv is: b'4fda3c622e966e0839441401bbd3b8f191d4267bf5f19b40812a34b212fd3ed9'
C# function:
public static string AesEncrypt(byte[] payload, byte[] key, byte[] iv)
{
    var cipher = new GcmBlockCipher(new AesEngine());
    byte[] baPayload = new byte[0];
    cipher.Init(true, new AeadParameters(new KeyParameter(key), 128, iv, baPayload));
    var cipherBytes = new byte[cipher.GetOutputSize(payload.Length)];
    int len = cipher.ProcessBytes(payload, 0, payload.Length, cipherBytes, 0);
    cipher.DoFinal(cipherBytes, len);
    return Convert.ToBase64String(cipherBytes);
}
The PASSPHRASE can be converted to bytes in C# using the following function:
public static byte[] StringToByteArray(string hex)
{
    return Enumerable.Range(0, hex.Length)
                     .Where(x => x % 2 == 0)
                     .Select(x => Convert.ToByte(hex.Substring(x, 2), 16))
                     .ToArray();
}
Python:
import binascii
from Crypto.Cipher import AES

PASSPHRASE = b'a4b42ed2702cb1b00a14f39a88c719cb04e5e8b29e2479634c990258e327483'

def AES_encrypt(data, iv):
    global PASSPHRASE
    data_json_64 = data
    key = binascii.unhexlify(PASSPHRASE)
    cipher = AES.new(key, AES.MODE_GCM, iv)
    x = cipher.encrypt(data)
    return x
Because all my data is the same, I expect my output to be the same, but it is not.
My input test string to both C# and Python is:
"In publishing and graphic design, Lorem ipsum is a"
My iv to both C# and Python is:
'4fda3c622e966e0839441401bbd3b8f191d4267bf5f19b40812a34b212fd3ed9'
The encoded output in C# is:
02Em9Vve6fWtAcVNesIXzagoB327EmskwMZdRippAAaxqAzkp0VeGSjctbaguqA/01CnPHB2PkRDDOxjgZ9pAfu2
The encoded output in Python is:
HudpKzIov7lNt4UNng+a9P/FLXrzdenwDBT4uFYhIUc3XOS7TpaCzxja8I+zHCdXnvk=
After implementing the changes suggested in the comments and some minor tweaks to your code, it is possible to reproduce your C# result. The main problem was the handling of the ciphertext and the tag. As @Topaco has stated in the comments, C# implicitly concatenates ciphertext and tag as ciphertext|tag, which means you need to do the same in your code. If you follow the GCM documentation you will see that you need to use the encrypt_and_digest() function in order to obtain the ciphertext and the tag. Here is what you need to change:
cipher = AES.new(key, AES.MODE_GCM, iv) → cipher = AES.new(key, AES.MODE_GCM, binascii.unhexlify(iv))
x = cipher.encrypt(data) → x, tag = cipher.encrypt_and_digest(data)
return x → return x + tag
These changes should help you fix your code (a consolidated sketch follows below). I also have some further suggestions.

I wouldn't use global variables in your code. For starters, this means your function should be AES_encrypt(data, key, iv) and not AES_encrypt(data, iv). Why are global variables a bad idea here? It makes no sense to use a global variable for the password, as you shouldn't use the same password to encrypt everything.

Additionally, when talking about encryption, note that every algorithm uses a key, which is not the same thing as a password. What you are implementing here is password-based encryption, but you are not doing it correctly, as you are using the password (passphrase) directly as the cryptographic key. The idea behind password-based encryption is to use a good digest/key-stretching technique to derive a key from the password.

Finally, you should take a look at NIST Special Publication 800-38D to learn more about GCM and NIST's suggested parameters for implementing safe encryption.
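Putting the three changes together, a minimal sketch of the corrected function, assuming PyCryptodome and the hex-encoded key/iv from the question (aes_gcm_encrypt is a renamed, hypothetical variant that also drops the global):

import binascii
from Crypto.Cipher import AES

def aes_gcm_encrypt(data: bytes, passphrase: bytes, iv_hex: bytes) -> bytes:
    key = binascii.unhexlify(passphrase)  # hex passphrase -> raw key bytes
    cipher = AES.new(key, AES.MODE_GCM, nonce=binascii.unhexlify(iv_hex))
    ciphertext, tag = cipher.encrypt_and_digest(data)
    return ciphertext + tag  # mirror C#'s implicit ciphertext|tag concatenation

Base64 encoding the returned bytes should then match the Base64 output of the C# function for the same key, iv and data.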
I have a 10-character code that I want to sign in my Python program, then put both the code and the signature in a URL, which then gets processed by a PHP Slim API. There the signature should get verified.
I generated my RSA keys in Python like this:
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives.serialization import load_pem_private_key
from cryptography.hazmat.primitives.asymmetric import rsa
from cryptography.hazmat.primitives import serialization

def gen_key():
    private_key = rsa.generate_private_key(
        public_exponent=65537, key_size=2048, backend=default_backend()
    )
    return private_key

def save_key(pk):
    pem_priv = pk.private_bytes(
        encoding=serialization.Encoding.PEM,
        format=serialization.PrivateFormat.PKCS8,
        encryption_algorithm=serialization.NoEncryption()
    )
    with open(os.path.join('.', 'private_key.pem'), 'wb') as pem_out:
        pem_out.write(pem_priv)
    pem_pub = pk.public_key().public_bytes(
        encoding=serialization.Encoding.PEM,
        format=serialization.PublicFormat.SubjectPublicKeyInfo
    )
    with open(os.path.join('.', 'public_key.pem'), 'wb') as pem_out:
        pem_out.write(pem_pub)

def main():
    priv_key = gen_key()
    save_key(priv_key)
I sign the code like this in Python:
private_key = load_key()  # load_key() reads private_key.pem (not shown)
pub_key = private_key.public_key()
code = '09DD57CE10'
signature = private_key.sign(
    str.encode(code),
    padding.PSS(
        mgf=padding.MGF1(hashes.SHA256()),
        salt_length=padding.PSS.MAX_LENGTH
    ),
    hashes.SHA256()
)
The URL is built like this:
my_url = 'https://www.exmaple.com/codes?code={}&signature={}'.format(
    code,
    signature.hex()
)
Because the signature is a bytes object, I'm converting it to a string using the .hex() function.
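As a sanity check of that conversion on the Python side: .hex() emits high-nibble-first pairs and bytes.fromhex() inverts it losslessly, which matters for the pack() format discussed in the answer below:

sig_hex = signature.hex()  # hex string, high nibble first
assert bytes.fromhex(sig_hex) == signature  # lossless round trip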
Now, in PHP, I am trying to verify the code and signature:
use phpseclib3\Crypt\PublicKeyLoader;
$key = PublicKeyLoader::load(file_get_contents(__DIR__ . "/public_key.pem"));
echo $key->verify($code, pack('h*', $signature)) ? 'yes' : 'no';
I also tried using PHP's openssl_verify:
$pub_key = file_get_contents(__DIR__ . "/public_key.pem");
$res = openssl_verify($code, pack('n*', $signature), $pub_key, OPENSSL_ALGO_SHA256);
However, it always tells me the signature is wrong, when I know that in general it is the correct signature. The RSA keys are the correct, matching keys in both Python and PHP.
I think the issue is with the signature and how I had to convert it to a string and then back to a byte-like string in both Python and PHP.
The Python code uses PSS.MAX_LENGTH as the salt length. This value denotes the maximum salt length and is recommended in the Cryptography documentation (s. here):
salt_length (int) – The length of the salt. It is recommended that this be set to PSS.MAX_LENGTH
In RFC8017, which specifies PKCS#1 and thus also PSS, the default value of the salt length is defined as the output length of the hash (s. A.2.3. RSASSA-PSS):
For a given hashAlgorithm, the default value of saltLength is the octet length of the hash value.
Most libraries, e.g. phpseclib, use as the default salt length the default defined in RFC 8017, i.e. the output length of the hash (s. here). Therefore the maximum salt length must be set explicitly. The maximum salt length is given by (s. here):
signature length (bytes) - digest output length (bytes) - 2 = 256 - 32 - 2 = 222
for a 2048 bits key and SHA256.
Thus, the verification in the PHP code must be changed as follows:
$verified = $key
    ->withPadding(RSA::SIGNATURE_PSS)
    //->withHash('sha256')    // default
    //->withMGFHash('sha256') // default
    ->withSaltLength(256 - 32 - 2) // set maximum salt length
    ->verify($code, pack('H*', $signature)); // alternatively hex2bin()
Note that in the posted code of the question, h (hex string, low nibble first) is specified in the format string of pack(). I've chosen the more common H (hex string, high nibble first) in my code snippet, which is also compatible with Python's hex(). Ultimately, the format string to choose depends on the encoding applied in the Python code.
Using this change, on my machine, the signature generated with the Python code can be successfully verified with the PHP code.
Alternatively, of course, the salt length of the Python code can be adapted to the output length of the digest (32 bytes in this case).
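On the Python side, that adaptation is a one-line change in the sign() call; a sketch assuming the cryptography library from the question (SHA-256's digest_size is 32 bytes):

signature = private_key.sign(
    str.encode(code),
    padding.PSS(
        mgf=padding.MGF1(hashes.SHA256()),
        salt_length=hashes.SHA256().digest_size  # 32 bytes, the RFC 8017 default
    ),
    hashes.SHA256()
)

With this, phpseclib should verify with its defaults, i.e. without the withSaltLength() call.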
By the way, a verification with openssl_verify() is not possible, because PSS is not supported.
I wrote an application in Node.js that encrypts user passwords using AES-256-CTR:
const crypto = require('crypto')
const masterkey = 'azertyuiopazertyuiopazertyuiopaz'
const cipher = crypto.createCipher('aes-256-ctr', masterkey)
console.log(cipher.update('antoine', 'utf8', 'hex') + cipher.final('hex')) //=> 6415bc70ad76c6
It then gets persisted into a database, and now I'm trying to decipher it from a Python script using PyCrypto, like this:
import binascii
import os
from Crypto.Cipher import AES

masterkey = 'azertyuiopazertyuiopazertyuiopaz'
password = '6415bc70ad76c6'

counter = os.urandom(16)
# counter = bytes(16)                  # does not work
# counter = masterkey[0:16].encode()   # does not work

cipher = AES.new(masterkey, AES.MODE_CTR, counter=lambda: counter)
print(cipher.decrypt(binascii.a2b_hex(password)))
But it gives me completely wrong results here.
Do you know what I am missing?
EDIT
Thanks to zaph, it appears that the way my JavaScript code encrypts data is insecure. I still have to figure out what IV is being used internally by Node. I've tried many without success:
masterkey[0:16].encode()
bytes(16)
Update based on new information in the question: the best bet is that Node.js is using a default counter value.
The same counter value must be used for both encryption and decryption. But no counter value is provided on encryption, and a random value is used on decryption, so it can never work.
Use crypto.createCipheriv(algorithm, key, iv), where iv is the random initial counter value.
It is necessary to create a random counter value on encryption and save it so that the same initial counter value can be used on decryption. One option is to prefix the encrypted data with the counter value; it does not need to be secret. On decryption it can then be split off the encrypted data and used (see the sketch below).
Also, when using CTR mode, the same initial counter value must never be used again with the same key.
See CTR mode
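A minimal sketch of that IV-prefix scheme in Python, assuming PyCryptodome (the encrypt/decrypt helper names are mine, not from the question):

import os
from Crypto.Cipher import AES
from Crypto.Util import Counter

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    iv = os.urandom(16)  # random initial counter value
    ctr = Counter.new(128, initial_value=int.from_bytes(iv, 'big'))
    return iv + AES.new(key, AES.MODE_CTR, counter=ctr).encrypt(plaintext)

def decrypt(key: bytes, blob: bytes) -> bytes:
    iv, ciphertext = blob[:16], blob[16:]  # split the prefixed IV back off
    ctr = Counter.new(128, initial_value=int.from_bytes(iv, 'big'))
    return AES.new(key, AES.MODE_CTR, counter=ctr).decrypt(ciphertext)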
PyCrypto documentation (the IV notes below are from the CBC entry, but they apply equally to CTR's initial counter value):
MODE_CBC
Cipher-Block Chaining (CBC). Each of the ciphertext blocks depends on the current and all previous plaintext blocks. An Initialization Vector (IV) is required.
The IV is a data block to be transmitted to the receiver. The IV can be made public, but it must be authenticated by the receiver and it should be picked randomly.
The IV is the initial counter value.
Node.js documentation, Class: Cipher:
crypto.createCipheriv(algorithm, key, iv)
algorithm <string>
key <string> | <Buffer> | <TypedArray> | <DataView>
iv <string> | <Buffer> | <TypedArray> | <DataView>
Creates and returns a Cipher object, with the given algorithm, key and initialization vector (iv).
A server protocol requires me to derive a password hash with a limited key size. This is the given JavaScript + CryptoJS implementation:
var params = {keySize: size/32, hasher: CryptoJS.algo.SHA512, iterations: 5000}
var output = CryptoJS.PBKDF2(password, salt, params).toString();
I want to re-implement this in Python using Passlib, i.e. something like
from passlib.hash import pbkdf2_sha512
output = pbkdf2_sha512.hash(password, salt=salt, rounds=5000)
The Passlib API does not allow me to specify the key size. How to do it though?
If the derived key is too long, just truncate it to the required length. Each byte of the PBKDF2 output is just as valid as every other; it makes no difference which bytes you use, and there is no ordering.
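A minimal sketch without Passlib, using hashlib's dklen to do the truncation (assuming size is the key size in bits, matching the CryptoJS keySize: size/32, which counts 32-bit words):

import hashlib

def derive_key(password: bytes, salt: bytes, size: int) -> bytes:
    # dklen truncates the PBKDF2-HMAC-SHA512 output to size/8 bytes
    return hashlib.pbkdf2_hmac('sha512', password, salt, 5000, dklen=size // 8)

Calling .hex() on the result then corresponds to the CryptoJS toString() output, which is hex by default.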