Campaign a2a-hermes-v0.6.0-r14 FAIL
Infrastructure
Node roster
| # | Role | Agent ID | Public IP | Private IP |
|---|---|---|---|---|
| 1 | agent | ai:alice | 157.245.219.205 | 10.252.0.5 |
| 2 | agent | ai:bob | 167.172.234.175 | 10.252.0.2 |
| 3 | agent | ai:charlie | 68.183.145.110 | 10.252.0.4 |
| 4 | memory-only | — | 167.71.181.168 | 10.252.0.3 |
Baseline attestation BASELINE OK
Per the authoritative baseline spec, every agent node must emit a self-attestation before any scenario is permitted to run. This run's attestation:
Spec version: 1.2.0 — see authoritative baseline.
| Node | Agent | Framework | Authentic | MCP ai-memory | xAI cfg | xAI default | Agent ID | Federation | UFW off | iptables | dead-man | F1 xAI | F2a substrate | F2b agent (non-gating) | Config SHA | Pass |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| node-1 | ai:alice | Hermes Agent v0.10.0 (2026.4.16) | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | — | fa358f9a9059 | PASS |
| node-2 | ai:bob | Hermes Agent v0.10.0 (2026.4.16) | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | — | 21635cf63640 | PASS |
| node-3 | ai:charlie | Hermes Agent v0.10.0 (2026.4.16) | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | — | ce52d772ef5a | PASS |
a2a-baseline.json
{
"baseline_pass": true,
"per_node": [
{
"spec_version": "1.2.0",
"agent_type": "hermes",
"agent_id": "ai:alice",
"node_index": "1",
"framework_version": "Hermes Agent v0.10.0 (2026.4.16)",
"ai_memory_version": "0.6.0",
"peer_urls": "http://10.252.0.2:9077,http://10.252.0.4:9077,http://10.252.0.3:9077",
"config_file_sha256": "fa358f9a90597243fb96224babd541399bd7b1e972f364605308ab1e2d9dd2c7",
"config_attestation": {
"framework_is_authentic": true,
"mcp_server_ai_memory_registered": true,
"llm_backend_is_xai_grok": true,
"llm_is_default_provider": true,
"mcp_command_is_ai_memory": true,
"agent_id_stamped": true,
"federation_live": true,
"ufw_disabled": true,
"iptables_flushed": true,
"dead_man_switch_scheduled": true
},
"negative_invariants": {
"_description": "Alternative A2A channels must be OFF so a passing scenario is only passing via ai-memory shared memory. Any true here = thesis-preserving.",
"a2a_protocol_off": true,
"sub_agent_or_sessions_spawn_off": true,
"alternative_channels_off": true,
"tool_allowlist_is_memory_only": true,
"a2a_gate_profile_locked": true
},
"functional_probes": {
"xai_grok_chat_reachable": true,
"xai_grok_sample_reply": "READY",
"substrate_http_canary_f2a": true,
"substrate_http_canary_uuid": "f028b7f1-8f94-4cb7-9d57-5e1224aa1073",
"agent_mcp_canary_f2b": false,
"agent_mcp_canary_uuid": "ecf89ed9-a3ed-443a-a904-c2f59a34b133",
"agent_canary_response_head": "Traceback (most recent call last): File \"/usr/local/bin/hermes\", line 11, in <module> main() File \"/root/.hermes/hermes-agent/hermes_cli/main.py\", line 8487, in main args.func(args) File \"/root/.hermes/hermes-agent/hermes_cli/main.py\", line 1071, in cmd_chat if not _has_any_provider_configured(): ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File \"/root/.hermes/hermes-agent/hermes_cli/main.py\", line 209, in _has_any_provider_configured from hermes_cli.auth import get_auth_sta",
"_f2b_note": "F2b is LLM-dependent and non-blocking. F2a (deterministic HTTP substrate) gates baseline_pass.",
"agent_mcp_ai_memory_canary": true,
"canary_uuid": "f028b7f1-8f94-4cb7-9d57-5e1224aa1073",
"canary_namespace": "_baseline_canary_f2a"
},
"baseline_pass": true
},
{
"spec_version": "1.2.0",
"agent_type": "hermes",
"agent_id": "ai:bob",
"node_index": "2",
"framework_version": "Hermes Agent v0.10.0 (2026.4.16)",
"ai_memory_version": "0.6.0",
"peer_urls": "http://10.252.0.5:9077,http://10.252.0.4:9077,http://10.252.0.3:9077",
"config_file_sha256": "21635cf6364057fd2a004d28aac89abf8438671d85f9fd2ed1e654d812d23ff1",
"config_attestation": {
"framework_is_authentic": true,
"mcp_server_ai_memory_registered": true,
"llm_backend_is_xai_grok": true,
"llm_is_default_provider": true,
"mcp_command_is_ai_memory": true,
"agent_id_stamped": true,
"federation_live": true,
"ufw_disabled": true,
"iptables_flushed": true,
"dead_man_switch_scheduled": true
},
"negative_invariants": {
"_description": "Alternative A2A channels must be OFF so a passing scenario is only passing via ai-memory shared memory. Any true here = thesis-preserving.",
"a2a_protocol_off": true,
"sub_agent_or_sessions_spawn_off": true,
"alternative_channels_off": true,
"tool_allowlist_is_memory_only": true,
"a2a_gate_profile_locked": true
},
"functional_probes": {
"xai_grok_chat_reachable": true,
"xai_grok_sample_reply": "READY",
"substrate_http_canary_f2a": true,
"substrate_http_canary_uuid": "903fb38f-b04c-44df-bc52-82f54d57880b",
"agent_mcp_canary_f2b": false,
"agent_mcp_canary_uuid": "85411d61-9049-475a-bbc3-8c3d3fdcbdfc",
"agent_canary_response_head": "Traceback (most recent call last): File \"/usr/local/bin/hermes\", line 11, in <module> main() File \"/root/.hermes/hermes-agent/hermes_cli/main.py\", line 8487, in main args.func(args) File \"/root/.hermes/hermes-agent/hermes_cli/main.py\", line 1071, in cmd_chat if not _has_any_provider_configured(): ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File \"/root/.hermes/hermes-agent/hermes_cli/main.py\", line 209, in _has_any_provider_configured from hermes_cli.auth import get_auth_sta",
"_f2b_note": "F2b is LLM-dependent and non-blocking. F2a (deterministic HTTP substrate) gates baseline_pass.",
"agent_mcp_ai_memory_canary": true,
"canary_uuid": "903fb38f-b04c-44df-bc52-82f54d57880b",
"canary_namespace": "_baseline_canary_f2a"
},
"baseline_pass": true
},
{
"spec_version": "1.2.0",
"agent_type": "hermes",
"agent_id": "ai:charlie",
"node_index": "3",
"framework_version": "Hermes Agent v0.10.0 (2026.4.16)",
"ai_memory_version": "0.6.0",
"peer_urls": "http://10.252.0.5:9077,http://10.252.0.2:9077,http://10.252.0.3:9077",
"config_file_sha256": "ce52d772ef5a00968db29fb80eea7a14206b0a258a00ff2165db725405474618",
"config_attestation": {
"framework_is_authentic": true,
"mcp_server_ai_memory_registered": true,
"llm_backend_is_xai_grok": true,
"llm_is_default_provider": true,
"mcp_command_is_ai_memory": true,
"agent_id_stamped": true,
"federation_live": true,
"ufw_disabled": true,
"iptables_flushed": true,
"dead_man_switch_scheduled": true
},
"negative_invariants": {
"_description": "Alternative A2A channels must be OFF so a passing scenario is only passing via ai-memory shared memory. Any true here = thesis-preserving.",
"a2a_protocol_off": true,
"sub_agent_or_sessions_spawn_off": true,
"alternative_channels_off": true,
"tool_allowlist_is_memory_only": true,
"a2a_gate_profile_locked": true
},
"functional_probes": {
"xai_grok_chat_reachable": true,
"xai_grok_sample_reply": "READY",
"substrate_http_canary_f2a": true,
"substrate_http_canary_uuid": "9f89dce1-64c5-4777-aa3a-620221a21431",
"agent_mcp_canary_f2b": false,
"agent_mcp_canary_uuid": "121baf7b-9271-4d5f-ab23-2375e6232f8d",
"agent_canary_response_head": "Traceback (most recent call last): File \"/usr/local/bin/hermes\", line 11, in <module> main() File \"/root/.hermes/hermes-agent/hermes_cli/main.py\", line 8487, in main args.func(args) File \"/root/.hermes/hermes-agent/hermes_cli/main.py\", line 1071, in cmd_chat if not _has_any_provider_configured(): ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File \"/root/.hermes/hermes-agent/hermes_cli/main.py\", line 209, in _has_any_provider_configured from hermes_cli.auth import get_auth_sta",
"_f2b_note": "F2b is LLM-dependent and non-blocking. F2a (deterministic HTTP substrate) gates baseline_pass.",
"agent_mcp_ai_memory_canary": true,
"canary_uuid": "9f89dce1-64c5-4777-aa3a-620221a21431",
"canary_namespace": "_baseline_canary_f2a"
},
"baseline_pass": true
}
]
}
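The gating rule described above (every config attestation and negative invariant must hold, and the deterministic F2a canary must pass, while F2b is non-blocking) can be sketched as a check over `a2a-baseline.json`. This is a minimal sketch using the field names from the artifact above; the real gate lives in the campaign harness and may differ:

```python
# Hedged sketch of the baseline gate over a2a-baseline.json.
# Field names are taken from the artifact above; `_`-prefixed keys
# (e.g. "_description") are annotations, not checks.
def baseline_gate(report: dict) -> bool:
    """True only if every node attests, every invariant holds, and F2a passed."""
    for node in report["per_node"]:
        att = node["config_attestation"]
        neg = node["negative_invariants"]
        probes = node["functional_probes"]
        if not all(v for k, v in att.items() if not k.startswith("_")):
            return False
        # Negative invariants: every alternative A2A channel must be reported OFF.
        if not all(v for k, v in neg.items() if not k.startswith("_")):
            return False
        # F2a (deterministic HTTP substrate canary) gates; F2b is non-blocking.
        if not probes["substrate_http_canary_f2a"]:
            return False
        if not node["baseline_pass"]:
            return False
    return bool(report["baseline_pass"])
```

Note that `agent_mcp_canary_f2b` is deliberately not consulted, matching the `_f2b_note` in each node's probes.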
F3 — peer A2A via shared memory F3 OK
A workflow-level probe answering the question: can agents communicate through ai-memory? Writer ai:alice posted canary UUID 91e8dcc0-8031-438c-95c4-1e47bcb59b17 to namespace _baseline_peer_canary via node-1's local ai-memory serve HTTP endpoint. After the W=2 fanout settle, the probe confirmed the canary on each of the 3 peer nodes via their local GET /api/v1/memories.
f3-peer-a2a.json
{
"probe": "F3",
"name": "peer-a2a-via-shared-memory",
"description": "Writer agent posts a canary via local ai-memory HTTP on node-1; verifies the row propagates to the 3 peer nodes (W=2/N=4 quorum) before scenarios run.",
"canary_uuid": "91e8dcc0-8031-438c-95c4-1e47bcb59b17",
"canary_namespace": "_baseline_peer_canary",
"writer_agent": "ai:alice",
"pass": true
}
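The write-settle-verify flow of the F3 probe can be sketched with injectable transport callables. `post_memory` and `list_memories` stand in for the HTTP calls against each node's local ai-memory serve (POST and GET /api/v1/memories); their signatures here are assumptions for illustration, not the harness's real API:

```python
import time
import uuid

# Hedged sketch of the F3 peer-canary probe. Transport is injected so the
# shape of the flow is visible without pinning down the real HTTP client.
def f3_probe(post_memory, list_memories, peers,
             namespace="_baseline_peer_canary",
             settle=lambda: time.sleep(15)):
    canary = str(uuid.uuid4())
    post_memory(peers[0], namespace, canary)   # writer posts on the first node
    settle()                                   # wait for W=2 fanout to converge
    # Probe passes only if every peer's LOCAL read surfaces the canary.
    return all(canary in list_memories(p, namespace) for p in peers)
```

With a fake in-memory store standing in for the cluster, the probe passes when every peer read returns the canary and fails otherwise.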
AI NHI analysis
No per-campaign narrative recorded yet. scripts/analyze_run.py will generate one on the next dispatch.
Tests performed in this run
Every scenario that produced a JSON report in this campaign, listed in testbook order. Each scenario id links to its full report below. See the Every test performed page for the authoritative catalog.
| ID | Title | Result | Reason |
|---|---|---|---|
| S1 | Per-agent write + read (MCP stdio) | UNKNOWN | hermes CLI crashed: ModuleNotFoundError: No module named 'httpx' |
| S1b | Per-agent write + read (HTTP) | PASS | |
| S2 | Shared-context handoff | PASS | |
| S4 | Federation-aware concurrent writes | PASS | |
| S5 | Consolidation + curation | FAIL | consolidate endpoint returned HTTP 405 |
| S6 | Contradiction detection | FAIL | detect_contradiction endpoint returned HTTP 404 |
| S9 | Mutation round-trip | FAIL | PATCH returned HTTP 405; update did not propagate |
| S10 | Deletion propagation | FAIL | tombstone not propagated to any peer |
| S11 | Link integrity | FAIL | link POST returned HTTP 404 |
| S12 | Agent registration | UNKNOWN | POST /api/v1/agents returned HTTP 422 |
| S13 | Concurrent write contention | FAIL | no submitted PATCH value won on any peer |
| S14 | Partition tolerance | FAIL | node-3 saw 2/20 writes after recovery |
| S15 | Read-your-writes | PASS | |
| S16 | Tier promotion | FAIL | reader saw no tier; expected "long" |
| S17 | Stats consistency | FAIL | every peer counted 5 of the expected 15 rows |
| S18 | Semantic query expansion | FAIL | semantic query surfaced neither writer's memory |
Scenario 1 — Per-agent write + read (MCP stdio) UNKNOWN
scenario-1.json (report)
scenario-1.log (console trace)
[scenario-1 hermes] phase A: each agent writes 10 memories via MCP
[scenario-1 hermes] agent ai:alice on node-a (157.245.219.205)
Traceback (most recent call last):
File "/usr/local/bin/hermes", line 11, in <module>
main()
File "/root/.hermes/hermes-agent/hermes_cli/main.py", line 8487, in main
args.func(args)
File "/root/.hermes/hermes-agent/hermes_cli/main.py", line 1071, in cmd_chat
if not _has_any_provider_configured():
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/.hermes/hermes-agent/hermes_cli/main.py", line 209, in _has_any_provider_configured
from hermes_cli.auth import get_auth_status
File "/root/.hermes/hermes-agent/hermes_cli/auth.py", line 38, in <module>
import httpx
ModuleNotFoundError: No module named 'httpx'
Scenario 1b — Per-agent write + read (HTTP) PASS
scenario-1b.json (report)
{
"scenario": "1b",
"pass": true,
"agent_group": "hermes",
"path": "serve-http",
"expected_per_reader": 20,
"per_agent": {
"ai:alice": {
"recall": 20
},
"ai:bob": {
"recall": 20
},
"ai:charlie": {
"recall": 20
}
},
"reasons": []
}
scenario-1b.log (console trace)
[scenario-1b hermes] phase A: each agent POSTs 10 memories to local serve
[scenario-1b hermes] agent ai:alice on node-a (157.245.219.205)
[scenario-1b hermes] agent ai:bob on node-b (167.172.234.175)
[scenario-1b hermes] agent ai:charlie on node-c (68.183.145.110)
[scenario-1b hermes] settle 15s for W=2/N=4 convergence
[scenario-1b hermes] phase B: each reader counts rows in the OTHER two namespaces via local serve
[scenario-1b hermes] ai:alice sees 20 rows from the other two namespaces
[scenario-1b hermes] ai:bob sees 20 rows from the other two namespaces
[scenario-1b hermes] ai:charlie sees 20 rows from the other two namespaces
[scenario-1b hermes] phase C: cross-cluster identity verification via node-4
[scenario-1b hermes] ns=scenario1b-ai:alice count=10 wrong_agent_id=0
[scenario-1b hermes] ns=scenario1b-ai:bob count=10 wrong_agent_id=0
[scenario-1b hermes] ns=scenario1b-ai:charlie count=10 wrong_agent_id=0
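Why `expected_per_reader` is 20: each of the 3 agents writes 10 rows into its own namespace, and each reader counts only the other agents' namespaces. A minimal model of that arithmetic and the per-reader check (function names are illustrative, not the harness's):

```python
# Hedged sketch of the scenario-1b pass arithmetic.
def expected_per_reader(n_agents: int, writes_per_agent: int) -> int:
    # A reader never counts its own namespace, only the other agents'.
    return (n_agents - 1) * writes_per_agent

def reader_passes(observed: int, n_agents: int = 3, writes_per_agent: int = 10) -> bool:
    return observed == expected_per_reader(n_agents, writes_per_agent)
```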
Scenario 2 — Shared-context handoff PASS
scenario-2.json (report)
{
"scenario": "2",
"pass": true,
"agent_group": "hermes",
"path": "serve-http",
"per_agent": {
"ai:bob": {
"sees_handoff": 1
},
"ai:alice": {
"sees_ack": 1
}
},
"handoff_uuid": "0336ff7c-dd35-4e4c-b69b-3671def65e6a",
"ack_uuid": "eca0b1a5-4233-49f6-8fae-6c7792d8deb4",
"reasons": []
}
scenario-2.log (console trace)
[scenario-2 hermes] phase A: ai:alice writes handoff to ai:bob (uuid=0336ff7c-dd35-4e4c-b69b-3671def65e6a)
[scenario-2 hermes] settle 8s for quorum fanout
[scenario-2 hermes] phase B: ai:bob reads handoff on node-2
[scenario-2 hermes] ai:bob sees 1 handoff memories from ai:alice
[scenario-2 hermes] phase C: ai:bob writes acknowledgement (uuid=eca0b1a5-4233-49f6-8fae-6c7792d8deb4)
[scenario-2 hermes] settle 8s for reverse-direction fanout
[scenario-2 hermes] phase D: ai:alice reads ack on node-1
[scenario-2 hermes] ai:alice sees 1 ack memories from ai:bob
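The four-phase handoff protocol in this scenario can be modeled with an injectable shared store; `write` and `read` stand in for ai-memory HTTP calls, and the names here are illustrative assumptions:

```python
# Hedged sketch of the scenario-2 handoff/ack round trip.
def handoff_roundtrip(write, read, settle=lambda: None) -> bool:
    write("ai:alice", "handoff")                                  # phase A: alice posts the handoff
    settle()                                                      # quorum fanout
    ok_handoff = "handoff" in read("ai:bob", author="ai:alice")   # phase B: bob reads it
    write("ai:bob", "ack")                                        # phase C: bob acknowledges
    settle()                                                      # reverse-direction fanout
    ok_ack = "ack" in read("ai:alice", author="ai:bob")           # phase D: alice reads the ack
    return ok_handoff and ok_ack
```

The scenario passes only when both directions of the exchange are observed, which is what the `sees_handoff` and `sees_ack` counters in the report capture.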
Scenario 4 — Federation-aware concurrent writes PASS
scenario-4.json (report)
{
"scenario": "4",
"pass": true,
"agent_group": "hermes",
"expected_per_agent": 30,
"per_agent": {
"ai:alice": {
"count": 30,
"wrong_agent_id": 0
},
"ai:bob": {
"count": 30,
"wrong_agent_id": 0
},
"ai:charlie": {
"count": 30,
"wrong_agent_id": 0
}
},
"reasons": []
}
scenario-4.log (console trace)
[scenario-4 hermes] phase A: launching concurrent 30-row bursts from 3 agents
[scenario-4 hermes] all 3 bursts complete; settle 20s for W=2 fanout convergence
[scenario-4 hermes] phase B: querying node-4 aggregator for per-agent counts
[scenario-4 hermes] ai:alice: count=30 (expected 30) wrong_agent_id=0
[scenario-4 hermes] ai:bob: count=30 (expected 30) wrong_agent_id=0
[scenario-4 hermes] ai:charlie: count=30 (expected 30) wrong_agent_id=0
Scenario 5 — Consolidation + curation FAIL
Reasons: consolidate endpoint returned HTTP 405 — may not exist in this ai-memory version | consolidate did not return a new memory id
scenario-5.json (report)
{
"scenario": "5",
"pass": false,
"agent_group": "hermes",
"consolidated_id": "",
"consolidate_http_code": "405",
"consolidated_from_agents": "",
"reasons": [
"consolidate endpoint returned HTTP 405 — may not exist in this ai-memory version",
"consolidate did not return a new memory id"
]
}
scenario-5.log (console trace)
[scenario-5 hermes] phase A: each agent writes 3 related memories
[scenario-5 hermes] ai:alice on 157.245.219.205
[scenario-5 hermes] ai:bob on 167.172.234.175
[scenario-5 hermes] ai:charlie on 68.183.145.110
[scenario-5 hermes] phase B: trigger memory_consolidate on node-1
[scenario-5 hermes] consolidate returned HTTP 405
[scenario-5 hermes] consolidated memory id=
[scenario-5 hermes] phase C: verifying consolidated_from_agents on node-4
[scenario-5 hermes] consolidated_from_agents=
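The HTTP 405 here suggests the consolidate route is absent or method-blocked in this ai-memory build, which is a different failure mode from a federation bug. A hedged triage helper, using only generic HTTP status semantics (no endpoint paths are assumed):

```python
# Sketch: classify a probe's HTTP status into failure modes worth reporting
# separately, so "feature missing in this version" is not conflated with
# "feature broken".
def classify_probe_status(code: int) -> str:
    if code == 404:
        return "endpoint-missing"      # route not present in this version
    if code == 405:
        return "method-unsupported"    # route exists but rejects this verb
    if 200 <= code < 300:
        return "supported"
    return "error"
```

Under this reading, scenario 5's 405 is "method-unsupported" rather than a data-plane failure, matching the hedge in the reason string above.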
Scenario 6 — Contradiction detection FAIL
Reasons: detect_contradiction endpoint returned HTTP 404 — may not exist in this ai-memory version | response did not include both memories | response did not include a contradicts relation
scenario-6.json (report)
{
"scenario": "6",
"pass": false,
"agent_group": "hermes",
"topic": "sky-color-0e2fdf4c",
"alice_id": "2be31495-67c1-4d76-b691-a3ce44210c94",
"bob_id": "2be31495-67c1-4d76-b691-a3ce44210c94",
"detect_http_code": "404",
"charlie_sees_both_memories": 0,
"charlie_sees_contradicts_link": 0,
"reasons": [
"detect_contradiction endpoint returned HTTP 404 — may not exist in this ai-memory version",
"response did not include both memories",
"response did not include a contradicts relation"
]
}
scenario-6.log (console trace)
[scenario-6 hermes] alice writes claim: "sky-color-0e2fdf4c is blue" on node-1
[scenario-6 hermes] bob writes contradicting claim: "sky-color-0e2fdf4c is red" on node-2
[scenario-6 hermes] alice.id=2be31495-67c1-4d76-b691-a3ce44210c94 bob.id=2be31495-67c1-4d76-b691-a3ce44210c94
[scenario-6 hermes] charlie queries memory_detect_contradiction on node-3
[scenario-6 hermes] HTTP 404; body length=0
[scenario-6 hermes] sees both memories: 0; sees contradicts link: 0
Scenario 9 — Mutation round-trip FAIL
Reasons: charlie expected content=601ba75b-2873-4b81-ac9d-aab507640133 got "" (update didn't propagate or PATCH unsupported) | metadata.agent_id changed from ai:alice to "" — Task 1.2 immutability breach
scenario-9.json (report)
{
"scenario": "9",
"pass": false,
"agent_group": "hermes",
"m1_id": "a529116a-8f3d-4a64-9117-6a80dbf6ff7a",
"v1_uuid": "b444df51-0306-460d-ab52-1b038ec03618",
"v2_uuid": "601ba75b-2873-4b81-ac9d-aab507640133",
"patch_http_code": "405",
"charlie_view": {
"content": "",
"agent_id": ""
},
"reasons": [
"charlie expected content=601ba75b-2873-4b81-ac9d-aab507640133 got \"\" (update didn't propagate or PATCH unsupported)",
"metadata.agent_id changed from ai:alice to \"\" — Task 1.2 immutability breach"
]
}
scenario-9.log (console trace)
[scenario-9 hermes] alice writes M1 content=b444df51-0306-460d-ab52-1b038ec03618 on node-1
[scenario-9 hermes] created memory id=a529116a-8f3d-4a64-9117-6a80dbf6ff7a
[scenario-9 hermes] bob updates M1 content=601ba75b-2873-4b81-ac9d-aab507640133 on node-2
[scenario-9 hermes] PATCH returned HTTP 405
[scenario-9 hermes] charlie reads M1 on node-3 and checks content + provenance
[scenario-9 hermes] charlie sees content="" agent_id=""
Scenario 10 — Deletion propagation FAIL
Reasons: 3/3 peers still see M1 after delete — tombstone not propagated
scenario-10.json (report)
{
"scenario": "10",
"pass": false,
"agent_group": "hermes",
"m1_id": "3ac9d19f-c921-484f-bfc7-a6b203594820",
"uuid": "e8078d3b-9f1c-4c8d-9506-c0231ba82a62",
"delete_http_code": "200",
"pre_delete_visible_peers": 3,
"post_delete_still_visible_peers": 3,
"reasons": [
"3/3 peers still see M1 after delete — tombstone not propagated"
]
}
scenario-10.log (console trace)
[scenario-10 hermes] alice writes M1 content=e8078d3b-9f1c-4c8d-9506-c0231ba82a62 on node-1
[scenario-10 hermes] created memory id=3ac9d19f-c921-484f-bfc7-a6b203594820
[scenario-10 hermes] pre-delete: verifying M1 is visible on all peers
[scenario-10 hermes] pre-delete node-2 sees 1
[scenario-10 hermes] pre-delete node-3 sees 1
[scenario-10 hermes] pre-delete node-4 sees 1
[scenario-10 hermes] alice deletes M1 on node-1
[scenario-10 hermes] DELETE returned HTTP 200
[scenario-10 hermes] post-delete: verifying M1 is GONE from all peers
[scenario-10 hermes] post-delete node-2 sees 1 (expected 0)
[scenario-10 hermes] post-delete node-3 sees 1 (expected 0)
[scenario-10 hermes] post-delete node-4 sees 1 (expected 0)
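The pass condition for this scenario is that a delete acknowledged with HTTP 200 on the writer eventually makes the row invisible on every peer. A minimal sketch of that check, with `visible_on` standing in for a per-peer GET (names are illustrative):

```python
# Hedged sketch of the scenario-10 tombstone check.
def tombstone_propagated(delete_code: int, visible_on, peers):
    """Return (passed, peers_still_showing_the_row)."""
    still_visible = [p for p in peers if visible_on(p)]
    passed = delete_code == 200 and not still_visible
    return passed, still_visible
```

In this run the delete was acknowledged (HTTP 200) but all 3 peers still returned the row, so the check fails with every peer listed.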
Scenario 11 — Link integrity FAIL
Reasons: link POST returned HTTP 404 — endpoint may not exist in this ai-memory version | charlie could not see M1->M2 link after settle
scenario-11.json (report)
{
"scenario": "11",
"pass": false,
"agent_group": "hermes",
"m1_id": "27bac32c-9a46-4770-b68d-15f19bad393d",
"m2_id": "a4c1fa9c-fd0d-4345-b8ac-d892924be426",
"relation": "related_to",
"link_http_code": "404",
"charlie_sees_link": 0,
"reasons": [
"link POST returned HTTP 404 — endpoint may not exist in this ai-memory version",
"charlie could not see M1->M2 link after settle"
]
}
scenario-11.log (console trace)
[scenario-11 hermes] alice writes M1 on node-1
[scenario-11 hermes] bob writes M2 on node-2
[scenario-11 hermes] M1=27bac32c-9a46-4770-b68d-15f19bad393d M2=a4c1fa9c-fd0d-4345-b8ac-d892924be426
[scenario-11 hermes] alice links M1 -> M2 with relation=related_to
[scenario-11 hermes] link POST returned HTTP 404
[scenario-11 hermes] charlie queries links of M1 on node-3
[scenario-11 hermes] charlie sees M1->M2 link: 0 (expected >=1)
Scenario 12 — Agent registration UNKNOWN
scenario-12.json (report)
scenario-12.log (console trace)
[scenario-12 hermes] alice registers new agent ai:dave-probe-3b1f2786 on node-1
[scenario-12 hermes] POST /api/v1/agents returned HTTP 422
Scenario 13 — Concurrent write contention FAIL
Reasons: winning content is not one of the submitted PATCH values: got "(none)"
scenario-13.json (report)
{
"scenario": "13",
"pass": false,
"agent_group": "hermes",
"m1_id": "2b58b0f7-0d7f-425e-9685-52e2e83957a0",
"submitted": {
"v0": "8c2af31f-427f-4b8b-92e3-671ec11a4c9c",
"vA_alice": "11835b27-bb8c-4614-841f-e927accc847f",
"vB_bob": "8c67014c-b589-47f7-90f5-362d5af8b178"
},
"peer_view": {
"node_1": "(none)",
"node_2": "(none)",
"node_3": "(none)",
"node_4": "(none)"
},
"reasons": [
"winning content is not one of the submitted PATCH values: got \"(none)\""
]
}
scenario-13.log (console trace)
[scenario-13 hermes] alice writes M1 content=v0 on node-1
[scenario-13 hermes] M1 id=2b58b0f7-0d7f-425e-9685-52e2e83957a0
[scenario-13 hermes] alice + bob issue concurrent PATCHes (vA=11835b27-bb8c-4614-841f-e927accc847f from alice, vB=8c67014c-b589-47f7-90f5-362d5af8b178 from bob)
[scenario-13 hermes] settle 10s for quorum convergence
[scenario-13 hermes] node-1 sees content=(none)
[scenario-13 hermes] node-2 sees content=(none)
[scenario-13 hermes] node-3 sees content=(none)
[scenario-13 hermes] node-4 sees content=(none)
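Scenario 13's pass condition, stated as code: after concurrent PATCHes, every peer must converge on the same content, and that content must be one of the values actually submitted (v0, vA, or vB). A sketch whose field names mirror `scenario-13.json`:

```python
# Hedged sketch of the scenario-13 convergence check.
def contention_converged(peer_view: dict, submitted: dict) -> bool:
    """All peers agree, and the winning value was one of the submitted writes."""
    values = set(peer_view.values())
    return len(values) == 1 and values.pop() in submitted.values()
```

In this run every peer reported "(none)", which is unanimous but not a submitted value, so the check fails; given the HTTP 405 PATCH result in scenario 9, the PATCHes likely never landed at all.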
Scenario 14 — Partition tolerance FAIL
Reasons: node-3 only saw 2/20 writes after partition recovery — catchup failed or W=2 wasn't satisfied during outage
scenario-14.json (report)
{
"scenario": "14",
"pass": false,
"agent_group": "hermes",
"partition_target": "node-3",
"expected_post_recovery": 20,
"node3_saw": 2,
"reasons": [
"node-3 only saw 2/20 writes after partition recovery — catchup failed or W=2 wasn't satisfied during outage"
]
}
scenario-14.log (console trace)
[scenario-14 hermes] suspending ai-memory on node-3 (SIGSTOP)
[scenario-14 hermes] writing 10 memories each from alice + bob during node-3 outage
[scenario-14 hermes] resuming ai-memory on node-3 (SIGCONT)
[scenario-14 hermes] settle 20s for post-partition catchup
[scenario-14 hermes] checking node-3 caught up
[scenario-14 hermes] node-3 sees 2 memories in scenario14-partition (expected 20)
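The partition mechanic in the log is signal-based: the ai-memory process on node-3 is frozen with SIGSTOP (a simulated partition), writes proceed during the outage, then the process is resumed with SIGCONT and given time to catch up. A minimal sketch of the freeze/resume step; how the harness discovers the PID is not shown in the log and is left out here:

```python
import os
import signal

# Hedged sketch of the scenario-14 partition/heal primitives.
def partition(pid: int) -> None:
    os.kill(pid, signal.SIGSTOP)   # freeze the replica; peers see it as unresponsive

def heal(pid: int) -> None:
    os.kill(pid, signal.SIGCONT)   # resume; anti-entropy should replay missed writes
```

Node-3 recovering only 2 of 20 writes points at the catch-up path (or at W=2 not actually being satisfied by the remaining nodes during the outage) rather than at the signal mechanics themselves.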
Scenario 15 — Read-your-writes PASS
scenario-15.json (report)
{
"scenario": "15",
"pass": true,
"agent_group": "hermes",
"uuid": "52f404cc-35d5-4148-94eb-7f9f9dd48cb3",
"writer_sees_own_write": 1,
"reasons": []
}
scenario-15.log (console trace)
[scenario-15 hermes] alice writes + immediately reads M1 on node-1 (uuid=52f404cc-35d5-4148-94eb-7f9f9dd48cb3)
[scenario-15 hermes] alice sees 1 (expected 1) immediately after write
Scenario 16 — Tier promotion FAIL
Reasons: bob sees tier="(missing)", expected "long"
scenario-16.json (report)
{
"scenario": "16",
"pass": false,
"agent_group": "hermes",
"m1_id": "bf433251-c3c3-4efc-8576-62dc38d0a99c",
"promote_http_code": "200",
"bob_sees_tier": "(missing)",
"reasons": [
"bob sees tier=\"(missing)\", expected \"long\""
]
}
scenario-16.log (console trace)
[scenario-16 hermes] alice writes M1 tier=short on node-1
[scenario-16 hermes] M1 id=bf433251-c3c3-4efc-8576-62dc38d0a99c
[scenario-16 hermes] alice promotes M1 to tier=long
[scenario-16 hermes] promote returned HTTP 200
[scenario-16 hermes] bob reads M1 on node-2 and checks tier
[scenario-16 hermes] bob sees tier=(missing) (expected long)
Scenario 17 — Stats consistency FAIL
Reasons: node-1 count=5 != expected 15 | node-2 count=5 != expected 15 | node-3 count=5 != expected 15 | node-4 count=5 != expected 15
scenario-17.json (report)
{
"scenario": "17",
"pass": false,
"agent_group": "hermes",
"expected_count": 15,
"per_peer": {
"node_1": 5,
"node_2": 5,
"node_3": 5,
"node_4": 5
},
"reasons": [
"node-1 count=5 != expected 15",
"node-2 count=5 != expected 15",
"node-3 count=5 != expected 15",
"node-4 count=5 != expected 15"
]
}
scenario-17.log (console trace)
[scenario-17 hermes] phase A: each of 3 agents writes 5 memories to scenario17-stats
[scenario-17 hermes] ai:alice on 157.245.219.205
[scenario-17 hermes] ai:bob on 167.172.234.175
[scenario-17 hermes] ai:charlie on 68.183.145.110
[scenario-17 hermes] settle 15s for W=2 fanout
[scenario-17 hermes] phase B: querying count on every peer
[scenario-17 hermes] node-1 count=5 (expected 15)
[scenario-17 hermes] node-2 count=5 (expected 15)
[scenario-17 hermes] node-3 count=5 (expected 15)
[scenario-17 hermes] node-4 count=5 (expected 15)
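The expected count is simply agents times writes (3 x 5 = 15) on every peer. Every peer reporting exactly 5 is consistent with each node counting only a single agent's writes, i.e. fanout never happening for this namespace. A sketch of the check (names are illustrative):

```python
# Hedged sketch of the scenario-17 stats-consistency check.
def stats_consistent(per_peer: dict, n_agents: int = 3, writes_per_agent: int = 5):
    """Return (passed, expected_count, peers_with_wrong_counts)."""
    expected = n_agents * writes_per_agent
    mismatched = {p: c for p, c in per_peer.items() if c != expected}
    return not mismatched, expected, mismatched
```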
Scenario 18 — Semantic query expansion FAIL
Reasons: semantic query did not surface alice's memory | semantic query did not surface bob's memory
scenario-18.json (report)
{
"scenario": "18",
"pass": false,
"agent_group": "hermes",
"query": "morning outdoor exercise routine",
"writers": [
{
"agent": "ai:alice",
"marker": "alice-sunrise-7d4b72be-017f-42d4-aff9-7f58975394d0",
"seen_by_charlie": 0
},
{
"agent": "ai:bob",
"marker": "bob-daybreak-c918391a-8539-45ad-b5ca-0fa13d972838",
"seen_by_charlie": 0
}
],
"reasons": [
"semantic query did not surface alice's memory",
"semantic query did not surface bob's memory"
]
}
scenario-18.log (console trace)
[scenario-18 hermes] alice writes A on node-1
[scenario-18 hermes] bob writes B on node-2
[scenario-18 hermes] settle 15s for fanout + index rebuild
[scenario-18 hermes] charlie queries on node-3 with semantically-related prompt
[scenario-18 hermes] charlie sees alice's memory: 0 (expected >=1)
[scenario-18 hermes] charlie sees bob's memory: 0 (expected >=1)
All artifacts
- a2a-baseline.json
- a2a-summary.json
- campaign.meta.json
- f3-peer-a2a.json
- scenario-1.json
- scenario-10.json
- scenario-11.json
- scenario-12.json
- scenario-13.json
- scenario-14.json
- scenario-15.json
- scenario-16.json
- scenario-17.json
- scenario-18.json
- scenario-1b.json
- scenario-2.json
- scenario-4.json
- scenario-5.json
- scenario-6.json
- scenario-9.json
- scenario-1.log
- scenario-10.log
- scenario-11.log
- scenario-12.log
- scenario-13.log
- scenario-14.log
- scenario-15.log
- scenario-16.log
- scenario-17.log
- scenario-18.log
- scenario-1b.log
- scenario-2.log
- scenario-4.log
- scenario-5.log
- scenario-6.log
- scenario-9.log