Get keys of json-object in JavaScript [duplicate]

Asked · Active 3 hr before · Viewed 126 times

5 Answers

Tags: duplicate · object · javascript

I have a JSON array, and I want a loop that alerts me 'person' and 'age', which are the keys of the first object in the array.

var jsonData = [{
   "person": "me",
   "age": "30"
}, {
   "person": "you",
   "age": "25"
}];

for (var i in jsonData) {
   var key = i;
   var val = jsonData[i];
   for (var j in val) {
      var sub_key = j;
      var sub_val = val[j];
      console.log(sub_key);
   }
}

EDIT

var jsonObj = {
   "person": "me",
   "age": "30"
};
Object.keys(jsonObj); // returns ["person", "age"]

var data = [{
   "name": "Peter",
   "age": 30,
   "hair color": "brown"
}, {
   "name": "Steve",
   "age": 55,
   "hair color": "blonde"
}, {
   "name": "Steve",
   "age": 55,
   "hair color": "blonde"
}]

data = data.filter((obj, pos, arr) => {
   return arr
      .map(mapObj => mapObj.name)
      .indexOf(obj.name) == pos;
});

console.log(data);
/*
[{
    "name": "Peter",
    "age": 30,
    "hair color": "brown"
}, {
    "name": "Steve",
    "age": 55,
    "hair color": "blonde"
}]
*/
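A one-pass alternative sketch: track names already seen in a Set instead of re-scanning the array with map/indexOf for every element. (The Set-based filter is not from the answer above, just a common variant.)

```javascript
var data = [
  { "name": "Peter", "age": 30, "hair color": "brown" },
  { "name": "Steve", "age": 55, "hair color": "blonde" },
  { "name": "Steve", "age": 55, "hair color": "blonde" }
];

// Keep the first object seen for each name; skip later duplicates.
var seen = new Set();
var unique = data.filter(function (obj) {
  if (seen.has(obj.name)) return false;
  seen.add(obj.name);
  return true;
});

console.log(unique.length); // 2
```

This is O(n) instead of O(n²), which matters once the array grows beyond a few hundred elements.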

The validity of duplicate keys in JSON is an exception, not a rule, so this becomes a problem in actual implementations. In the typical object-oriented world, it's not easy to work with duplicate key-value pairs, and the standard does not define how a deserializer/parser should handle such objects. An implementation is free to choose its own path, and the behaviour is completely unpredictable from one library to another.

We came across a strange problem when building a new RESTful API recently. If we sent duplicate keys in our JSON request, how should the API handle it? Shouldn't that request be rejected straight away as invalid JSON? Are duplicate keys even allowed in JSON? I did a bit of digging around to clear up this debate, and this is what I found.

This kind of difference in behaviour can be problematic, particularly in modern polyglot architectures, where the behaviour of different services should ideally be as consistent as possible. It may be unlikely that such a scenario would actually occur, but if and when it does, it would definitely help to know how your applications behave and to have it documented for your consumers and fellow developers.

RFC 7159, published as the standard for JSON by the Internet Engineering Task Force (IETF) and since superseded by RFC 8259 (which keeps the same wording), states "The names within an object SHOULD be unique". Sounds pretty clear, right? However, according to RFC 2119, which defines the terminology used in IETF documents, the word "should" in fact means "... there may exist valid reasons in particular circumstances to ignore a particular item, but the full implications must be understood and carefully weighed before choosing a different course." So, things just got a bit more confusing. What this essentially means is that while having unique keys is recommended, it is not a must. We can have duplicate keys in a JSON object, and it would still be valid.

// Despite the firstName key being repeated, this is still valid JSON
{
   "id": 1,
   "firstName": "John",
   "firstName": "Jane",
   "lastName": "Doe"
}
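What a JavaScript parser actually does with such input: JSON.parse does not reject the duplicate key; the last occurrence silently wins (this last-wins behaviour is what current ECMAScript engines implement). A quick check:

```javascript
var raw = '{ "firstName": "John", "firstName": "Jane", "lastName": "Doe" }';

// JSON.parse accepts the duplicate key without error;
// the last value for "firstName" overwrites the earlier one.
var parsed = JSON.parse(raw);
console.log(parsed.firstName);            // "Jane"
console.log(Object.keys(parsed).length);  // 2
```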

I have a JSON string that I get from a database which contains repeated keys. I want to remove the repeated keys by combining their values into an array.

I need to use Java for this. org.json throws an exception because of the repeated keys; gson can parse the string, but each repeated key overwrites the last one. I need to keep all the data.

As of today, the org.json library (version 20170516) provides an accumulate() method that stores the duplicate key entries in a JSONArray.

Input

{
   "a": "b",
   "c": "d",
   "c": "e",
   "f": "g"
}

Output

{
   "a": "b",
   "c": ["d", "e"],
   "f": "g"
}

The names within an object SHOULD be unique. If a key is duplicated, a parser SHOULD reject. If it does not reject, it MUST take only the last of the duplicated key pairs.

SHOULD reject seems too strong given that we know perfectly well we cannot do that. MAY seems much more reasonable, or maybe MUST either reject or take the last of the duplicated key pairs (decision up to the parser API specification).

That doesn't make sense: "should" would have to be "must", because if the parser takes the last key, later data can overwrite someone else's.

The JSON RFC says

 The names within an object SHOULD be unique.

Sadly, there are people in the JavaScript community who interpreted SHOULD to mean DON'T HAVE TO. In a perfect world, we would change the SHOULD to MUST, but we can't. Interoperability and security concerns demand that we specify what happens when keys are duplicated. So we may end up with something like this:

 The names within an object SHOULD be unique. If a key is duplicated,
 a parser SHOULD reject. If it does not reject, it MUST take only the
 last of the duplicated key pairs.
