I am seeing strange behavior with the CryptStringToBinary Cryptography API.
Please see the code below (configuration: x64 Debug):

#include "stdafx.h"
#include <windows.h>
#include <strsafe.h>
#include <tchar.h>
#include <iostream>
#include <exception>

int main()
{
	DWORD dwSkip;
	DWORD dwFlags;
	DWORD dwDataLen;
	//LPCWSTR pszInput = L"jAAAAAECAAADZgAAAKQAAGdnNL1l56BWGFjDGR3RpxTQqqn6DAw3USv2eMkJYm4t";  // this works fine

	LPCWSTR	pszInput = L"MyTest"; // doesn't work: API returns FALSE, error code 0x0000000d

	// Determine the size of the BYTE array and allocate memory.
	if( !CryptStringToBinary(
		pszInput,
		_tcslen( pszInput ) + 1,
		CRYPT_STRING_BASE64,
		NULL,          // first call: query required buffer size only
		&dwDataLen,
		&dwSkip,
		&dwFlags ) )
	{
		DWORD dw = GetLastError(); // 0x0000000d: The data is invalid
		throw std::exception( "Error computing Byte length." );
	}

	BYTE *pbyteByte = NULL;
	try
	{
		pbyteByte = new BYTE[ dwDataLen ];
		if( !pbyteByte )
			throw std::exception( "Wrong array size." );
	}
	catch( std::exception &ex )
	{
		throw ex;
	}
	catch( ... )
	{
		throw std::exception( "Out of memory." );
	}
	return 0;
}

With the first (commented-out) pszInput string, CryptStringToBinary returns TRUE, but with L"MyTest" it returns FALSE with error code 0x0000000d. The problem seems related to the length of the string I pass to the API: when I pass the length without the null terminator, the API always returns TRUE. But in that case, is the BYTE length it returns correct?

Could anybody help me understand the reason behind this behavior?
Also, in which case does the API return the correct BYTE length?
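As a side check on the expected output size: the decoded length of a base-64 string can be computed directly from its character count. A minimal portable sketch in standard C++ (not the Windows API itself; the helper name Base64DecodedLength is mine):

```cpp
#include <cassert>
#include <cstddef>
#include <string>

// Decoded byte count of a strict base-64 string: every 4 input
// characters encode 3 bytes, minus one byte per trailing '=' pad.
// Returns 0 for input whose length is not a multiple of 4.
std::size_t Base64DecodedLength(const std::string& s)
{
    if (s.empty() || s.size() % 4 != 0)
        return 0; // not well-formed strict base-64
    std::size_t padding = 0;
    if (s[s.size() - 1] == '=') ++padding;
    if (s.size() > 1 && s[s.size() - 2] == '=') ++padding;
    return s.size() / 4 * 3 - padding;
}
```

The 64-character example string above has no '=' padding, so it should decode to 48 bytes; "MyTest" (6 characters) is not a multiple of 4 and is therefore not decodable at all.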

Thanks in advance!
Updated 7-Jan-23 6:00am

1 solution

You're telling the API you're passing the input in as base-64 (CRYPT_STRING_BASE64),

yet "MyTest" isn't a base-64 encoded 'string'.

I wouldn't blame the API for rejecting it, personally - its behaviour seems normal to me.
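A quick way to see why the API rejects it: under CRYPT_STRING_BASE64 the input must be well-formed base-64 (alphabet characters only, optional trailing '=' padding, length a multiple of 4). A portable sketch of such a check in standard C++ (this mirrors, but is not, the API's internal validation; the helper name LooksLikeBase64 is mine):

```cpp
#include <cassert>
#include <cctype>
#include <cstddef>
#include <string>

// Rough well-formedness test for strict base-64 (no line breaks):
// only A-Z, a-z, 0-9, '+', '/', plus at most two trailing '=' pads,
// and a total length that is a multiple of 4.
bool LooksLikeBase64(const std::string& s)
{
    if (s.empty() || s.size() % 4 != 0)
        return false;
    std::size_t end = s.size();
    while (end > 0 && s[end - 1] == '=')
        --end;
    if (s.size() - end > 2) // more than two '=' pads is invalid
        return false;
    for (std::size_t i = 0; i < end; ++i) {
        unsigned char c = static_cast<unsigned char>(s[i]);
        if (!std::isalnum(c) && c != '+' && c != '/')
            return false;
    }
    return true;
}
```

Every character of "MyTest" is in the base-64 alphabet, but its length (6) is not a multiple of 4, so a strict decoder reports invalid data; "TXlUZXN0" (the actual base-64 encoding of the bytes "MyTest") would decode fine.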
Nikhil Sisodia 7-Oct-13 3:14am    
Thanks Garth J Lancaster for your input!

I got your point about the string format being CRYPT_STRING_BASE64.
In that case, what format should I be using? Is it OK to use CRYPT_STRING_ANY?

Also, the length I am passing to this API includes the NULL character, whereas MSDN says:
cchString [in]
The number of characters of the formatted string to be converted, not including the terminating NULL character.

But in my case, even though I am passing the length including the NULL character, the first pszInput string still returns TRUE.
Is my usage of the length parameter correct?
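On the length parameter: per the MSDN quote above, cchString must not count the terminating NUL, and wcslen/_tcslen already returns exactly that count. A small portable sketch (standard C++; the helper name CchForCryptStringToBinary is mine, illustrating the intended argument, not a real API):

```cpp
#include <cassert>
#include <cstddef>
#include <cwchar>

// cchString for CryptStringToBinary per MSDN: the number of characters
// of the formatted string, NOT including the terminating NUL.
// wcslen already excludes the NUL, so no "+ 1" should be added.
std::size_t CchForCryptStringToBinary(const wchar_t* psz)
{
    return std::wcslen(psz); // correct: not wcslen(psz) + 1
}
```

So the call in the question should pass `_tcslen(pszInput)`, not `_tcslen(pszInput) + 1`; that the base-64 example string happens to succeed even with the extra character is not something to rely on.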

Sergey Alexandrovich Kryukov 7-Oct-13 3:44am    
You should not use any format specifier, you should not use the input string you are using.
In other words, the code is correct, but what you are trying to do with "MyTest" makes no sense. Rejecting this string is correct behavior, as it is not formatted in any of the expected formats. You probably did not get the purpose of this API. Read the documentation again.
Sergey Alexandrovich Kryukov 7-Oct-13 3:41am    
Of course, a 5.

This content, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)
