Big JSON Team • Technical Writer
# Advanced JSON structures: Patterns and optimizations for developers
This guide explores advanced JSON patterns for building robust, maintainable, and performant data structures.
Design patterns
1. Polymorphism with discriminators
Type discriminator
{
"forms": [
{
"type": "text",
"name": "username",
"label": "Username",
"maxLength": 50
},
{
"type": "email",
"name": "email",
"label": "Email",
"required": true
},
{
"type": "select",
"name": "country",
"label": "Country",
"options": ["France", "Belgium", "Switzerland"]
}
]
}
TypeScript mapping
type FormField = TextInput | EmailInput | SelectInput;
interface BaseField {
name: string;
label: string;
}
interface TextInput extends BaseField {
type: 'text';
maxLength?: number;
}
interface EmailInput extends BaseField {
type: 'email';
required?: boolean;
}
interface SelectInput extends BaseField {
type: 'select';
options: string[];
}
// Type guard
function isTextInput(field: FormField): field is TextInput {
return field.type === 'text';
}
// Usage (fields: FormField[])
fields.forEach(field => {
if (isTextInput(field)) {
console.log('Max length:', field.maxLength);
}
});
2. Data normalization
Denormalized (redundant)
{
"posts": [
{
"id": 1,
"title": "First post",
"author": {
"id": 101,
"name": "Alice",
"email": "alice@example.com"
}
},
{
"id": 2,
"title": "Second post",
"author": {
"id": 101,
"name": "Alice",
"email": "alice@example.com"
}
}
]
}
Normalized (efficient)
{
"posts": {
"1": {
"id": 1,
"title": "First post",
"authorId": 101
},
"2": {
"id": 2,
"title": "Second post",
"authorId": 101
}
},
"authors": {
"101": {
"id": 101,
"name": "Alice",
"email": "alice@example.com"
}
}
}
Normalization with normalizr
import { normalize, schema } from 'normalizr';
// Define schemas
const author = new schema.Entity('authors');
const post = new schema.Entity('posts', {
author: author
});
const postList = [post];
// Normalize
const originalData = [
  {
    id: 1,
    title: "First post",
    author: { id: 101, name: "Alice" }
  }
];
const normalized = normalize(originalData, postList);
console.log(normalized);
/*
{
  entities: {
    authors: { 101: { id: 101, name: "Alice" } },
    posts: { 1: { id: 1, title: "First post", author: 101 } }
  },
  result: [1]
}
*/
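Going the other way, rebuilding nested objects from the normalized shape, can be sketched with a small helper. This is a plain-JavaScript illustration with a hypothetical function name; normalizr also ships its own denormalize for this.

```javascript
// Minimal sketch: rebuild nested posts from a normalized store.
// Hypothetical helper, not normalizr's denormalize().
function denormalizePosts(normalized) {
  const { posts, authors } = normalized.entities;
  return normalized.result.map(id => {
    const post = posts[id];
    // Swap the author id back for the full author object.
    return { ...post, author: authors[post.author] };
  });
}

const store = {
  entities: {
    authors: { 101: { id: 101, name: "Alice" } },
    posts: { 1: { id: 1, title: "First post", author: 101 } }
  },
  result: [1]
};

console.log(denormalizePosts(store));
// → [ { id: 1, title: 'First post', author: { id: 101, name: 'Alice' } } ]
```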
3. References and links
HAL (Hypertext Application Language)
{
"_links": {
"self": { "href": "/orders/123" },
"customer": { "href": "/customers/456" },
"items": { "href": "/orders/123/items" }
},
"id": 123,
"total": 99.99,
"status": "pending"
}
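Clients are expected to follow these links rather than hard-code URLs. A tiny helper for reading a link relation might look like this (an illustrative sketch, not part of any HAL library):

```javascript
// Minimal sketch: read a link relation from a HAL resource.
function getLink(resource, rel) {
  const link = resource._links && resource._links[rel];
  return link ? link.href : null;
}

const order = {
  _links: {
    self: { href: "/orders/123" },
    customer: { href: "/customers/456" }
  },
  id: 123,
  total: 99.99
};

console.log(getLink(order, "customer")); // → /customers/456
console.log(getLink(order, "invoice"));  // → null (relation absent)
```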
JSON:API
{
"data": {
"type": "articles",
"id": "1",
"attributes": {
"title": "JSON:API paints my bikeshed!"
},
"relationships": {
"author": {
"data": { "type": "people", "id": "9" }
}
}
},
"included": [
{
"type": "people",
"id": "9",
"attributes": {
"firstName": "Alice",
"lastName": "Smith"
}
}
]
}
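To materialize the author, a consumer joins the relationship identifier against the included array. A minimal resolution sketch (illustrative helper; a real client also handles to-many relationships and missing resources):

```javascript
// Minimal sketch: resolve a to-one relationship against "included".
function resolveRelationship(doc, relName) {
  const rel = doc.data.relationships[relName].data; // { type, id }
  return (doc.included || []).find(
    res => res.type === rel.type && res.id === rel.id
  ) || null;
}

const doc = {
  data: {
    type: "articles",
    id: "1",
    attributes: { title: "JSON:API paints my bikeshed!" },
    relationships: { author: { data: { type: "people", id: "9" } } }
  },
  included: [
    {
      type: "people",
      id: "9",
      attributes: { firstName: "Alice", lastName: "Smith" }
    }
  ]
};

console.log(resolveRelationship(doc, "author").attributes.firstName); // → Alice
```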
4. API versioning
Version in the payload
{
"apiVersion": "2.0",
"data": {
"id": 123,
"name": "Product"
}
}
Handling multiple versions
function serializeUser(user, version) {
switch (version) {
case '1.0':
return {
id: user.id,
name: user.name
};
case '2.0':
return {
id: user.id,
firstName: user.firstName,
lastName: user.lastName,
profile: {
avatar: user.avatar
}
};
default:
throw new Error(`Version ${version} not supported`);
}
}
Optimizations
1. Structural compression
Before optimization
{
"users": [
{
"id": 1,
"name": "Alice",
"age": 30,
"city": "Paris"
},
{
"id": 2,
"name": "Bob",
"age": 25,
"city": "Lyon"
}
]
}
After optimization (columns)
{
"columns": ["id", "name", "age", "city"],
"data": [
[1, "Alice", 30, "Paris"],
[2, "Bob", 25, "Lyon"]
]
}
Size reduction: roughly 40% for this example.
Conversion
// Compress
function compressToColumns(objects) {
if (objects.length === 0) return { columns: [], data: [] };
const columns = Object.keys(objects[0]);
const data = objects.map(obj => columns.map(col => obj[col]));
return { columns, data };
}
// Decompress
function decompressFromColumns({ columns, data }) {
return data.map(row => {
const obj = {};
columns.forEach((col, i) => {
obj[col] = row[i];
});
return obj;
});
}
2. Short keys
Before
{
"userIdentifier": "abc123",
"emailAddress": "user@example.com",
"phoneNumber": "+33123456789"
}
After
{
"uid": "abc123",
"email": "user@example.com",
"phone": "+33123456789"
}
Mapping
const keyMap = {
uid: 'userIdentifier',
email: 'emailAddress',
phone: 'phoneNumber'
};
function expandKeys(compressed) {
const expanded = {};
Object.entries(compressed).forEach(([short, value]) => {
const long = keyMap[short] || short;
expanded[long] = value;
});
return expanded;
}
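The reverse direction, shrinking long keys before sending data over the wire, is symmetric. A sketch that derives the inverse lookup from the same keyMap (repeated here so the snippet stands alone):

```javascript
// Short → long mapping, as above.
const keyMap = {
  uid: 'userIdentifier',
  email: 'emailAddress',
  phone: 'phoneNumber'
};

// Build the long → short lookup once.
const inverseMap = Object.fromEntries(
  Object.entries(keyMap).map(([short, long]) => [long, short])
);

function compressKeys(expanded) {
  const compressed = {};
  Object.entries(expanded).forEach(([long, value]) => {
    // Unknown keys pass through unchanged.
    compressed[inverseMap[long] || long] = value;
  });
  return compressed;
}

console.log(compressKeys({ userIdentifier: "abc123", emailAddress: "user@example.com" }));
// → { uid: 'abc123', email: 'user@example.com' }
```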
3. Omitting default values
// Before
{
  "enabled": true,
  "visible": true,
  "autoSave": true,
  "theme": "light"
}
// After (defaults: enabled=true, visible=true, autoSave=false, theme="light")
{
  "autoSave": true // kept only because it differs from the default
}
const defaults = {
enabled: true,
visible: true,
autoSave: false,
theme: 'light'
};
function stripDefaults(obj) {
const result = {};
Object.entries(obj).forEach(([key, value]) => {
if (value !== defaults[key]) {
result[key] = value;
}
});
return result;
}
function applyDefaults(obj) {
return { ...defaults, ...obj };
}
4. Binary encoding
For very large datasets, consider:
- Protocol Buffers (protobuf)
- MessagePack
- BSON
- Apache Avro
// MessagePack example
const msgpack = require('msgpack-lite');
const data = { name: "Alice", age: 30 };
// Encode
const packed = msgpack.encode(data);
console.log('JSON size:', JSON.stringify(data).length); // 25
console.log('MsgPack size:', packed.length); // 17
// Decode
const unpacked = msgpack.decode(packed);
Validation patterns
Schema composition
{
"$schema": "http://json-schema.org/draft-07/schema#",
"definitions": {
"address": {
"type": "object",
"properties": {
"street": { "type": "string" },
"city": { "type": "string" },
"zipCode": { "type": "string" }
},
"required": ["street", "city"]
}
},
"type": "object",
"properties": {
"name": { "type": "string" },
"homeAddress": { "$ref": "#/definitions/address" },
"workAddress": { "$ref": "#/definitions/address" }
}
}
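A validator resolves each $ref by walking the JSON Pointer after the #. The mechanics can be sketched with a toy resolver for local references (real validators such as Ajv do this, plus ~0/~1 escaping, remote refs, and caching):

```javascript
// Minimal sketch: resolve a local "#/..." JSON Pointer inside a schema.
// (Ignores ~0/~1 pointer escaping for brevity.)
function resolveRef(schema, ref) {
  if (!ref.startsWith("#/")) throw new Error("only local refs supported");
  return ref
    .slice(2)
    .split("/")
    .reduce((node, key) => node[key], schema);
}

const schema = {
  definitions: {
    address: {
      type: "object",
      properties: { street: { type: "string" }, city: { type: "string" } },
      required: ["street", "city"]
    }
  },
  type: "object",
  properties: {
    homeAddress: { $ref: "#/definitions/address" }
  }
};

const resolved = resolveRef(schema, schema.properties.homeAddress.$ref);
console.log(resolved.required); // → [ 'street', 'city' ]
```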
Conditional validation
{
"type": "object",
"properties": {
"country": { "type": "string" },
"postalCode": { "type": "string" }
},
"if": {
"properties": { "country": { "const": "France" } }
},
"then": {
"properties": {
"postalCode": {
"pattern": "^\\d{5}$"
}
}
},
"else": {
"properties": {
"postalCode": {
"type": "string"
}
}
}
}
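The if/then/else semantics boil down to: when the if subschema matches, apply then, otherwise apply else. Hand-coded in plain JavaScript, the postal-code rule above reduces to the following (a sketch, not a JSON Schema validator):

```javascript
// Minimal sketch of the if/then/else rule above, hand-coded.
function validatePostalCode(obj) {
  if (obj.country === "France") {
    // "then" branch: exactly five digits required
    return /^\d{5}$/.test(obj.postalCode);
  }
  // "else" branch: any string is accepted
  return typeof obj.postalCode === "string";
}

console.log(validatePostalCode({ country: "France", postalCode: "75001" })); // → true
console.log(validatePostalCode({ country: "France", postalCode: "ABC" }));   // → false
console.log(validatePostalCode({ country: "Belgium", postalCode: "1000" })); // → true
```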
State management
Event Sourcing
{
"events": [
{
"type": "UserCreated",
"timestamp": "2026-01-16T10:00:00Z",
"data": {
"userId": 123,
"name": "Alice"
}
},
{
"type": "EmailUpdated",
"timestamp": "2026-01-16T11:00:00Z",
"data": {
"userId": 123,
"email": "alice@example.com"
}
},
{
"type": "UserDeleted",
"timestamp": "2026-01-16T12:00:00Z",
"data": {
"userId": 123
}
}
]
}
function replayEvents(events) {
const state = {};
events.forEach(event => {
switch (event.type) {
case 'UserCreated':
state[event.data.userId] = { ...event.data };
break;
case 'EmailUpdated':
state[event.data.userId].email = event.data.email;
break;
case 'UserDeleted':
delete state[event.data.userId];
break;
}
});
return state;
}
CQRS (Command Query Responsibility Segregation)
{
"commands": {
"CreateUser": {
"schema": {
"name": "string",
"email": "string"
}
},
"UpdateEmail": {
"schema": {
"userId": "number",
"email": "string"
}
}
},
"queries": {
"GetUser": {
"params": { "userId": "number" },
"returns": "User"
},
"ListUsers": {
"params": { "page": "number" },
"returns": "User[]"
}
}
}
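At runtime this split can be wired as two separate handler maps, commands mutating state and queries only reading it. A minimal in-memory sketch (the handler bodies here are hypothetical):

```javascript
// Minimal in-memory CQRS sketch: commands mutate, queries only read.
const state = { users: {}, nextId: 1 };

const commands = {
  CreateUser({ name, email }) {
    const id = state.nextId++;
    state.users[id] = { id, name, email };
    return id;
  },
  UpdateEmail({ userId, email }) {
    state.users[userId].email = email;
  }
};

const queries = {
  GetUser({ userId }) {
    return state.users[userId];
  },
  ListUsers() {
    return Object.values(state.users);
  }
};

// Usage
const id = commands.CreateUser({ name: "Alice", email: "a@example.com" });
commands.UpdateEmail({ userId: id, email: "alice@example.com" });
console.log(queries.GetUser({ userId: id }).email); // → alice@example.com
```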
Streaming and pagination
Cursor-based pagination
{
"data": [
{ "id": 1, "name": "Item 1" },
{ "id": 2, "name": "Item 2" }
],
"paging": {
"cursors": {
"before": "MTAxNTExOTQ1MjAwNzI5NDE=",
"after": "NDMyNzQyODI3OTQw"
},
"next": "https://api.example.com/items?after=NDMyNzQyODI3OTQw"
}
}
Infinite scroll
async function* fetchAllPages(baseUrl) {
let nextUrl = baseUrl;
while (nextUrl) {
const response = await fetch(nextUrl);
const data = await response.json();
yield data.data;
nextUrl = data.paging?.next;
}
}
// Usage
for await (const items of fetchAllPages('/api/items')) {
items.forEach(item => console.log(item));
}
GraphQL-like queries
Query DSL
{
"query": {
"user": {
"fields": ["id", "name", "email"],
"where": { "age": { "gt": 18 } },
"include": {
"posts": {
"fields": ["id", "title"],
"limit": 5
}
}
}
}
}
Implementation
function executeQuery(query, data) {
const result = {};
Object.entries(query).forEach(([entity, spec]) => {
let items = data[entity];
// Filter (matchesWhere is an assumed helper applying operators like { gt: 18 }, not shown)
if (spec.where) {
items = items.filter(item => matchesWhere(item, spec.where));
}
// Select fields
items = items.map(item => {
const selected = {};
spec.fields.forEach(field => {
selected[field] = item[field];
});
return selected;
});
// Include relations (naive: the same included entities are attached to every parent item)
if (spec.include) {
items = items.map(item => ({
...item,
...executeQuery(spec.include, data)
}));
}
result[entity] = items;
});
return result;
}
Conclusion
Essential patterns:
- Polymorphism with discriminators
- Normalization for efficiency
- References for relationships
- Compression for performance
- Columnar structure for large volumes
- Short keys to reduce size
- Omission of default values
- Binary formats when necessary
- Event sourcing for history
- CQRS for separation of concerns
- Cursor-based pagination
- Validation with JSON Schema
Master these patterns to build robust, high-performance APIs!