I have a simple HTTP request in my DataMapper which always returns:
Bad Request: Received a GET request with a text/plain; charset=UTF-8 Content-Type header
My DataMapper script in the Action step is as follows:
var myjson_customerInfo = {"files":[{"customername":"name1", "category":"cat2"}]};
var request = createHTTPRequest();
var my_url = "http://localhost:8080/get_customer_id?jsonInfo=" + JSON.stringify(myjson_customerInfo);
request.open("GET", my_url, "", "");
request.send();
if (request.status == "200") {
    var response = request.getResponseBody().replace(/(\xFF)|(\xFE)|(\x00)/g, "");
    logger.info("ajax response is: " + response);
    result = JSON.parse(response);
} else {
    logger.info("ajax response is: " + request.getResponseBody().replace(/(\xFF)|(\xFE)|(\x00)/g, ""));
}
Obviously I have a simple HTTP process in the workflow with the "get_customer_id" action.
My HTTP server log shows:
HTTP Request - 22:03:23
Connection from 127.0.0.1
Port used: 8080
URL: /get_customer_id
Bad Request: Received a GET request with a text/plain; charset=UTF-8 Content-Type header
The exact same request works with no issue from a web browser. It just doesn't work from the DataMapper in v2020.1.
Well, it appears you just uncovered two distinct issues that will require us to revisit how we implemented this functionality in the applications.
The server is rejecting any GET request that contains a Content-Type (it only accepts Content-Type headers for POST or PUT requests). Whether or not it should actually reject those requests is open for debate (I personally think it shouldn’t, but there are different opinions about this on the web).
The DataMapper automatically sets a Content-Type header for any request, whether a GET or a POST. In the case of a GET, it just shouldn’t do it (it should be up to you to specify a content-type if you deem it necessary… which it shouldn’t be). In most cases, servers will just ignore this Content-Type header, but not our NodeJS server.
So you hit the perfect storm with those two issues. Sorry about that. I will open tickets in our database for R&D to take a look at those issues.
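For illustration only, the rejection logic described above can be sketched as a small predicate. The function name is hypothetical; this is not the actual server code, just a model of the behavior described:

```javascript
// Hypothetical model of the server-side check described above:
// a Content-Type header is only considered legitimate on requests
// that carry a body (POST/PUT), so a GET with one gets a 400 Bad Request.
function shouldRejectRequest(method, headers) {
    var hasContentType = Object.keys(headers).some(function (h) {
        return h.toLowerCase() === "content-type";
    });
    var allowsBody = method === "POST" || method === "PUT";
    return hasContentType && !allowsBody;
}

// The DataMapper adds a Content-Type to every request it sends, so:
shouldRejectRequest("GET", { "Content-Type": "text/plain; charset=UTF-8" });  // true: rejected
shouldRejectRequest("POST", { "Content-Type": "text/plain; charset=UTF-8" }); // false: accepted
```

Under this model, switching the verb to POST makes the body-carrying check pass, which is why the workaround works.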
Fortunately, there is an easy way to get around the problem: change your request from GET to POST and it will work immediately.
Phil, thank you for your input on this. I did get conflicting information looking for answers on the web.
I am using the HTTP Server Input plugin, but you are suggesting that the issue also exists with the NodeJS Server.
I tested the script with the POST method as well, and in my case I get the following error in the DataMapper:
Nope, I still get the same error with NodeJS and POST. Could you share the script you are using to work around it with POST? Perhaps I am missing something?
OK, I forgot to change the port: NodeJS communicates on port 9090 by default in Workflow. The error is gone, but I get a new error on the result = JSON.parse(response) line:
var myjson_customerInfo = {"files":[{"customername":"name1", "category":"cat2"}]};
var request = createHTTPRequest();
var my_url = "http://localhost:9090/get_customer_id?jsonInfo=" + JSON.stringify(myjson_customerInfo);
request.open("POST", my_url, "", "");
request.send();
if (request.status == "200") {
    var response = request.getResponseBody().replace(/(\xFF)|(\xFE)|(\x00)/g, "");
    logger.info("ajax response is: " + response);
    result = JSON.parse(response);
} else {
    logger.info("ajax response is: " + request.getResponseBody().replace(/(\xFF)|(\xFE)|(\x00)/g, ""));
}
Now I looked at the content of the request file, and it seems the "values" field (or node) is empty, whereas with the HTTP Server Input it was populated with the query parameter jsonInfo, which I then use in Workflow to return the result of my query.
But because the jsonInfo field isn't passed as a subnode of the values field, my other script in Workflow receives an empty parameter and returns an empty JSON, which then causes this error.
NodeJS request:
You see the difference in the values I am getting? Why is that, and how can I get the jsonInfo field as a subnode of the values node in my NodeJS request file?
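In the meantime, if the NodeJS input really does leave the values node empty, one possible stopgap is to parse the query string out of the raw URL yourself. This is plain JavaScript with a hypothetical helper name, and it assumes the raw URL is still available to your Workflow script:

```javascript
// Hypothetical helper: extract a named query parameter from a raw URL,
// for cases where the server input does not split it into the values node.
function getQueryParam(rawUrl, name) {
    var queryStart = rawUrl.indexOf("?");
    if (queryStart === -1) return null;
    var pairs = rawUrl.slice(queryStart + 1).split("&");
    for (var i = 0; i < pairs.length; i++) {
        var eq = pairs[i].indexOf("=");
        var key = eq === -1 ? pairs[i] : pairs[i].slice(0, eq);
        if (decodeURIComponent(key) === name) {
            return eq === -1 ? "" : decodeURIComponent(pairs[i].slice(eq + 1));
        }
    }
    return null;
}

// Example with the same payload used in the thread (URL-encoded, so the
// braces and quotes in the JSON survive the round trip):
var rawUrl = "/get_customer_id?jsonInfo=" +
    encodeURIComponent(JSON.stringify({ files: [{ customername: "name1", category: "cat2" }] }));
var info = JSON.parse(getQueryParam(rawUrl, "jsonInfo"));
// info.files[0].customername === "name1"
```

Note that this only works reliably if the sender URL-encodes the JSON, as shown above.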
My daughter keeps poking me for attention, as I have spent far too much time on this issue. I have got to leave for today (as they say, live to fight another day; I accept defeat for today), hoping you will figure it out. Otherwise, I will raise a ticket with your support team.
Well if you were initially running the HTTP Server, then the workaround is easier. Just go back to using the HTTP Server Input task, configured as it was before, with the same process you had originally.
Then, you need to add a single line to your data mapping script, immediately after the request.open() statement:
request.setRequestHeader("content-type","");
This deletes the Content-Type header, which in turn allows the HTTP Server to accept the request. Unfortunately, that workaround does not seem to work for the NodeJS server, so you might as well stick with the HTTP Server.
Your complete script should now look like this:
var myjson_customerInfo = {"files":[{"customername":"name1", "category":"cat2"}]};
var request = createHTTPRequest();
var my_url = "http://localhost:8080/get_customer_id?jsonInfo=" + JSON.stringify(myjson_customerInfo);
request.open("GET", my_url, "", "");
request.setRequestHeader("content-type", ""); // this line deletes the Content-Type header
request.send();
if (request.status == "200") {
    var response = request.getResponseBody().replace(/(\xFF)|(\xFE)|(\x00)/g, "");
    result = JSON.parse(response);
    logger.info("ajax response is: " + response);
} else {
    logger.info("ajax response is: " + request.getResponseBody().replace(/(\xFF)|(\xFE)|(\x00)/g, ""));
}