I am using "System.Text.Json". My JSON file contains an array of comma-separated objects, but the code only works for one block of JSON data. If I use the whole data, it does not parse and throws an exception about '[' and ','. That should not be the case. Am I missing something? Please see the code and JSON data below.
Here's my code for the class:
Public Class query
    Public Property RN As String
    Public Property QUERY_ID As String
    Public Property QTYPE As String
    Public Property SENDERNAME As String
    Public Property SENDEREMAIL As String
    Public Property SUBJECT As String
    Public Property DATE_RE As String
    Public Property DATE_R As String
    Public Property DATE_TIME_RE As String
    Public Property GLUSR_USR_COMPANYNAME As String
    Public Property READ_STATUS As Object
    Public Property SENDER_GLUSR_USR_ID As Object
    Public Property MOB As String
    Public Property COUNTRY_FLAG As String
    Public Property QUERY_MODID As String
    Public Property LOG_TIME As String
    Public Property QUERY_MODREFID As Object
    Public Property DIR_QUERY_MODREF_TYPE As Object
    Public Property ORG_SENDER_GLUSR_ID As Object
    Public Property ENQ_MESSAGE As String
    Public Property ENQ_ADDRESS As String
    Public Property ENQ_CALL_DURATION As Object
    Public Property ENQ_RECEIVER_MOB As Object
    Public Property ENQ_CITY As String
    Public Property ENQ_STATE As String
    Public Property PRODUCT_NAME As String
    Public Property COUNTRY_ISO As String
    Public Property EMAIL_ALT As String
    Public Property MOBILE_ALT As String
    Public Property PHONE As Object
    Public Property PHONE_ALT As Object
    Public Property IM_MEMBER_SINCE As Object
    Public Property TOTAL_COUNT As String
End Class
Main Form code:
Imports System.IO
Imports System.Text.Json
Imports System.Text.Json.Serialization
Public Class Form1
    Private Sub Button1_Click(sender As Object, e As EventArgs) Handles Button1.Click
        Dim json As String = File.ReadAllText("~\test2.txt")
        Dim feed = JsonSerializer.Deserialize(Of query)(json)
        MsgBox(feed.ENQ_MESSAGE)
    End Sub
End Class
Here's the JSON data:
[
{
"RN": "1",
"QUERY_ID": "1519852833",
"QTYPE": "W",
"SENDERNAME": "Name1",
"SENDEREMAIL": "xyz@gmail.com",
"SUBJECT": "Requirement for Black PP Granules",
"DATE_RE": "30 Sep 2020",
"DATE_R": "30-Sep-20",
"DATE_TIME_RE": "30-Sep-2020 11:34:46 PM",
"GLUSR_USR_COMPANYNAME": "Company Name1",
"READ_STATUS": null,
"SENDER_GLUSR_USR_ID": null,
"MOB": "+91-1111111111",
"COUNTRY_FLAG": "",
"QUERY_MODID": "DIRECT",
"LOG_TIME": "20200930233446",
"QUERY_MODREFID": null,
"DIR_QUERY_MODREF_TYPE": null,
"ORG_SENDER_GLUSR_ID": null,
"ENQ_MESSAGE": "My Requirement is for Black PP Granules.",
"ENQ_ADDRESS": "Address1",
"ENQ_CALL_DURATION": null,
"ENQ_RECEIVER_MOB": null,
"ENQ_CITY": "Panipat",
"ENQ_STATE": "Haryana",
"PRODUCT_NAME": "Black PP Granules",
"COUNTRY_ISO": "IN",
"EMAIL_ALT": "xyz11@gmail.com",
"MOBILE_ALT": "+91-11111111111",
"PHONE": null,
"PHONE_ALT": null,
"IM_MEMBER_SINCE": null,
"TOTAL_COUNT": "178"
},
{
"RN": "2",
"QUERY_ID": "1519834488",
"QTYPE": "W",
"SENDERNAME": "Name2",
"SENDEREMAIL": "xyz2@gmail.com",
"SUBJECT": "Requirement for SHRILON Nylon Granules",
"DATE_RE": "30 Sep 2020",
"DATE_R": "30-Sep-20",
"DATE_TIME_RE": "30-Sep-2020 11:04:12 PM",
"GLUSR_USR_COMPANYNAME": "Company Name2",
"READ_STATUS": null,
"SENDER_GLUSR_USR_ID": null,
"MOB": "+91-22222222222",
"COUNTRY_FLAG": "",
"QUERY_MODID": "DIRECT",
"LOG_TIME": "20200930230412",
"QUERY_MODREFID": null,
"DIR_QUERY_MODREF_TYPE": null,
"ORG_SENDER_GLUSR_ID": null,
"ENQ_MESSAGE": "Requirement2",
"ENQ_ADDRESS": "Address2",
"ENQ_CALL_DURATION": null,
"ENQ_RECEIVER_MOB": null,
"ENQ_CITY": "Chennai",
"ENQ_STATE": "Tamil Nadu",
"PRODUCT_NAME": "SHRILON Nylon Granules",
"COUNTRY_ISO": "IN",
"EMAIL_ALT": null,
"MOBILE_ALT": "+912222222",
"PHONE": null,
"PHONE_ALT": null,
"IM_MEMBER_SINCE": null,
"TOTAL_COUNT": "178"
}
]
What I have tried:
I have tried confirming the JSON data format. The JSON is correct and valid per RFC 8259. To get this working, I have to remove the outer '[' and ']' and use just one block of data within '{}'. With the whole data, it does not parse and throws the exception about '[' and ','.
I have also tried Newtonsoft.Json and experienced the same problem.
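To show the workaround concretely, here is a minimal sketch of the only variant that parses for me. The class is trimmed to one property and the JSON is inlined for brevity; it works only because the outer '[' and ']' have been removed and a single '{...}' block remains:

```vbnet
Imports System.Text.Json

Module Workaround
    ' Trimmed-down version of the query class, one property for brevity.
    Public Class query
        Public Property ENQ_MESSAGE As String
    End Class

    Sub Main()
        ' A single '{...}' object (no surrounding array brackets): this deserializes fine.
        Dim json As String = "{""ENQ_MESSAGE"": ""My Requirement is for Black PP Granules.""}"
        Dim feed = JsonSerializer.Deserialize(Of query)(json)
        Console.WriteLine(feed.ENQ_MESSAGE)
    End Sub
End Module
```

As soon as the same call is fed the full file starting with '[', it fails with the exception described above.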