Fixed the "Query key condition not supported" error in KV Store operations by restructuring the DynamoDB table schema from a single partition key to a composite key structure.
The original implementation used an invalid DynamoDB KeyConditionExpression:
```ts
KeyConditionExpression: 'begins_with(pk, :prefix)'
```
This syntax is not supported in DynamoDB Query operations: in a KeyConditionExpression the partition key must be compared with equality, and `begins_with()` may only be applied to the sort key (or used in a FilterExpression), never to the partition key.
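For context, a minimal sketch of how the broken expression would have been issued, assuming the AWS SDK v3 Document Client (the client setup and table name are illustrative):

```ts
import { DynamoDBClient } from "@aws-sdk/client-dynamodb";
import { DynamoDBDocumentClient, QueryCommand } from "@aws-sdk/lib-dynamodb";

const doc = DynamoDBDocumentClient.from(new DynamoDBClient({}));

// DynamoDB rejects this query with "Query key condition not supported":
// begins_with() cannot be applied to the partition key.
await doc.send(new QueryCommand({
  TableName: "kv-store", // illustrative table name
  KeyConditionExpression: "begins_with(pk, :prefix)",
  ExpressionAttributeValues: { ":prefix": "user123:" },
}));
```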
Restructured the data model to use a composite key:
- Partition Key (pk): Contains the user namespace (e.g., "user123", "demo")
- Sort Key (sk): Contains the user's key (e.g., "settings", "config-app")
- Interface Change: Modified `NamespacedKVItem` to use `pk` and `sk` instead of a combined key
- Key Generation: Replaced `namespacedKey()` with `getKeys()`, returning `{pk, sk}`
- Query Operations: Fixed the KeyConditionExpression syntax (see the sketch after this list):

  ```ts
  // List operation
  KeyConditionExpression: 'pk = :pk AND begins_with(sk, :prefix)'

  // Query operation
  KeyConditionExpression: 'pk = :pk AND begins_with(sk, :pattern)'
  ```
- README.md: Added DynamoDB table schema requirements
- MIGRATION.md: Created comprehensive migration guide
- AWS Permissions: Updated to remove unnecessary Scan permission
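A minimal sketch of the new key handling and the corrected list query, assuming the AWS SDK v3 Document Client. The `NamespacedKVItem` and `getKeys()` names come from the changes above; the table name and the `listKeys` helper are illustrative, not the repository's exact code:

```ts
import { DynamoDBClient } from "@aws-sdk/client-dynamodb";
import { DynamoDBDocumentClient, QueryCommand } from "@aws-sdk/lib-dynamodb";

interface NamespacedKVItem {
  pk: string;      // user namespace, e.g. "user123"
  sk: string;      // user's key, e.g. "settings"
  data: unknown;
  created: string;
}

// Replaces namespacedKey(): the namespace and key map to pk/sk instead of one combined string.
function getKeys(namespace: string, key: string): { pk: string; sk: string } {
  return { pk: namespace, sk: key };
}

const doc = DynamoDBDocumentClient.from(new DynamoDBClient({}));

// List all items in a namespace whose key starts with a prefix (illustrative helper).
async function listKeys(namespace: string, prefix: string): Promise<NamespacedKVItem[]> {
  const result = await doc.send(new QueryCommand({
    TableName: "kv-store", // illustrative table name
    KeyConditionExpression: "pk = :pk AND begins_with(sk, :prefix)",
    ExpressionAttributeValues: { ":pk": namespace, ":prefix": prefix },
  }));
  return (result.Items ?? []) as NamespacedKVItem[];
}
```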
Required DynamoDB table schema:

```bash
aws dynamodb create-table \
  --table-name your-table-name \
  --attribute-definitions \
      AttributeName=pk,AttributeType=S \
      AttributeName=sk,AttributeType=S \
  --key-schema \
      AttributeName=pk,KeyType=HASH \
      AttributeName=sk,KeyType=RANGE \
  --billing-mode PAY_PER_REQUEST
```
Before (Broken):
{ "pk": "user123:settings", "data": {"theme": "dark"}, "created": "2024-01-01T00:00:00Z" }
After (Fixed):
{ "pk": "user123", "sk": "settings", "data": {"theme": "dark"}, "created": "2024-01-01T00:00:00Z" }
- Correct DynamoDB Usage: Uses proper KeyConditionExpression syntax
- Efficient Queries: Leverages DynamoDB's composite key capabilities
- Better Performance: Query operations instead of Scan
- Namespace Isolation: Each user's data is efficiently partitioned
- Prefix Matching: Supports efficient prefix queries within namespaces
⚠️ BREAKING CHANGE: Existing users must migrate their DynamoDB table schema. See MIGRATION.md for detailed instructions.
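As a rough sketch of the kind of migration MIGRATION.md covers, assuming the old and new item formats shown above (the table names, the colon-split handling, and the `migrate` helper itself are illustrative, not the actual migration script):

```ts
import { DynamoDBClient } from "@aws-sdk/client-dynamodb";
import { DynamoDBDocumentClient, ScanCommand, PutCommand } from "@aws-sdk/lib-dynamodb";

const doc = DynamoDBDocumentClient.from(new DynamoDBClient({}));

// Copy every old-format item ("namespace:key" packed into pk) into the new composite-key table.
async function migrate(oldTable: string, newTable: string): Promise<void> {
  let startKey: Record<string, any> | undefined;
  do {
    const page = await doc.send(new ScanCommand({
      TableName: oldTable,
      ExclusiveStartKey: startKey,
    }));
    for (const item of page.Items ?? []) {
      const combined = item.pk as string;           // e.g. "user123:settings"
      const splitAt = combined.indexOf(":");
      const namespace = combined.slice(0, splitAt); // new pk: "user123"
      const userKey = combined.slice(splitAt + 1);  // new sk: "settings"
      await doc.send(new PutCommand({
        TableName: newTable,
        Item: { ...item, pk: namespace, sk: userKey },
      }));
    }
    startKey = page.LastEvaluatedKey;
  } while (startKey);
}
```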
- Created `test-kv-fix.ts` to verify the fix
- Confirmed proper KeyConditionExpression structure
- Validated method signatures and error handling
Updated the `code-exec` tool to accept either direct code strings or S3 object keys containing code, providing more flexibility for code execution workflows.
Before:
- Only accepted the `code` parameter (direct code string)
- Required both `code` and `input` parameters
After:
- Accepts either `code` (direct string) or `key` (S3 object key)
- Validates that only one of `code` or `key` is provided
- Fetches code from S3 when `key` is specified (see the sketch after this list)
- Improved error handling and logging
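A minimal sketch of how the tool can resolve its code source under these rules, assuming the AWS SDK v3 S3 client (the bucket name, `resolveCode` function, and error messages are illustrative):

```ts
import { S3Client, GetObjectCommand } from "@aws-sdk/client-s3";

const s3 = new S3Client({});

// Resolve the code to execute from either an inline string or an S3 object key.
async function resolveCode(args: { code?: string; key?: string }): Promise<string> {
  if (args.code && args.key) {
    throw new Error("Provide either 'code' or 'key', not both");
  }
  if (args.code) {
    return args.code;
  }
  if (args.key) {
    const response = await s3.send(new GetObjectCommand({
      Bucket: "code-exec-scripts", // illustrative bucket name
      Key: args.key,
    }));
    // In SDK v3 the Body is a stream; transformToString() reads it fully.
    return await response.Body!.transformToString();
  }
  throw new Error("Either 'code' or 'key' must be provided");
}
```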
Before:
{ "name": "code-exec", "description": "Execute code javscript/ts (must define an async execute function which will recive input, and tools params)", "inputSchema": { "type": "object", "properties": { "code": { "type": "string", "description": "The code to execute" }, "input": { "type": "object", "description": "Input parameters for the skill" } }, "required": ["code", "input"] } }
After:
{ "name": "code-exec", "description": "Execute JavaScript/TypeScript code. The code must define an async execute function which will receive input and tools params. You can provide code either directly as a string or reference an S3 object key containing the code.", "inputSchema": { "type": "object", "properties": { "code": { "type": "string", "description": "The JavaScript/TypeScript code to execute directly (mutually exclusive with 'key')" }, "key": { "type": "string", "description": "S3 object key containing the code to execute (mutually exclusive with 'code')" }, "input": { "type": "object", "description": "Input parameters for the skill execution" } }, "required": ["input"], "oneOf": [ { "required": ["code", "input"] }, { "required": ["key", "input"] } ] } }
- Added comprehensive documentation in README.md
- Included examples for both execution modes
- Added security considerations
- Created example S3 script demonstrating best practices
- `example-s3-script.js`: Comprehensive example showing how to create reusable code modules (a minimal sketch follows this list)
- `test-code-exec.ts`: Test suite for the new functionality
- `test-simple.ts`: Simple validation test
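As a rough illustration (not the repository's actual example-s3-script.js), a script stored in S3 only needs to define the async `execute` function the tool expects:

```js
// Hypothetical module stored in S3, e.g. under scripts/data-processor.js.
// The code-exec tool calls execute(input, tools) and returns its result.
async function execute(input, tools) {
  const userId = input.userId;

  // Any MCP tools exposed to the sandbox are available via `tools`
  // (the helper name below is illustrative):
  // const settings = await tools.kvGet({ namespace: userId, key: "settings" });

  return {
    message: `Processed request for ${userId}`,
    receivedInput: input,
  };
}
```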
Direct code execution:

```bash
curl -X POST https://your-server/mcp \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer your-aws-secret" \
  -d '{
    "jsonrpc": "2.0",
    "method": "tools/call",
    "params": {
      "name": "code-exec",
      "arguments": {
        "code": "async function execute(input, tools) { return {message: \"Hello\", input}; }",
        "input": {"userId": "user123"}
      }
    },
    "id": 1
  }'
```
Execution from an S3 object key:

```bash
curl -X POST https://your-server/mcp \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer your-aws-secret" \
  -d '{
    "jsonrpc": "2.0",
    "method": "tools/call",
    "params": {
      "name": "code-exec",
      "arguments": {
        "key": "scripts/data-processor.js",
        "input": {"userId": "user123", "operation": "analyze"}
      }
    },
    "id": 1
  }'
```
- Reusability: Store commonly used scripts in S3 for reuse across multiple executions
- Version Control: Manage script versions by updating S3 objects
- Collaboration: Share scripts across different users and systems
- Maintainability: Separate complex logic into dedicated script files
- Flexibility: Choose between inline code for simple operations or S3 scripts for complex workflows
- Ensures mutual exclusivity between `code` and `key` parameters
- Validates that at least one code source is provided
- Proper error handling for S3 fetch failures
- Maintains backward compatibility with existing direct code usage
- Same sandboxed execution environment
- No additional security risks introduced
- S3 access uses existing AWS authentication
- Code execution remains restricted to available MCP tools