BinAssistMCP
Comprehensive Model Context Protocol (MCP) server for Binary Ninja with AI-powered reverse engineering capabilities
Summary
BinAssistMCP is a powerful bridge between Binary Ninja and Large Language Models (LLMs) like Claude, providing comprehensive reverse engineering tools through the Model Context Protocol (MCP). It enables AI-assisted binary analysis by exposing Binary Ninja's advanced capabilities through both Server-Sent Events (SSE) and STDIO transports.
Key Features
- Dual Transport Support: Both SSE (web-based) and STDIO (command-line) transports
- 40+ Analysis Tools: Complete Binary Ninja API wrapper with advanced functionality
- Multi-Binary Sessions: Concurrent analysis of multiple binaries with intelligent context management
- Smart Symbol Management: Advanced function searching, renaming, and type management
- Auto-Integration: Seamless Binary Ninja plugin with automatic startup capabilities
- Flexible Configuration: Comprehensive settings management through Binary Ninja's interface
- AI-Ready: Optimized for LLM integration with structured tool responses
Use Cases
- AI-Assisted Reverse Engineering: Leverage LLMs for intelligent code analysis and documentation
- Automated Binary Analysis: Script complex analysis workflows with natural language
- Research and Education: Teach reverse engineering concepts with AI guidance
- Security Analysis: Accelerate vulnerability research and malware analysis
- Code Understanding: Generate comprehensive documentation and explanations
Tool Details
BinAssistMCP provides over 40 specialized tools organized into functional categories:
Binary Management
- list_binaries - List all loaded binary files
- get_binary_status - Check analysis status and metadata
- update_analysis_and_wait - Force analysis update and wait for completion
Code Analysis & Decompilation
- decompile_function - Generate high-level decompiled code
- get_function_pseudo_c - Extract pseudo-C representation
- get_function_high_level_il - Access High-Level Intermediate Language
- get_function_medium_level_il - Access Medium-Level Intermediate Language
- get_disassembly - Retrieve assembly code with annotations
Information Retrieval
- get_functions - List all functions with metadata
- search_functions_by_name - Find functions by name patterns
- get_functions_advanced - Advanced filtering (size, complexity, parameters)
- search_functions_advanced - Multi-target searching (name, comments, calls, variables)
- get_function_statistics - Comprehensive binary statistics
- get_imports - Import table analysis grouped by module
- get_exports - Export table with symbol information
- get_strings - String extraction with context
- get_segments - Memory layout analysis
- get_sections - Binary section information
Symbol & Naming Management
- rename_symbol - Rename functions and data variables
- get_cross_references - Find all references to/from symbols
- analyze_function - Comprehensive function analysis
- get_call_graph - Call relationship mapping
Documentation & Comments
- set_comment - Add comments to specific addresses
- get_comment - Retrieve comments at addresses
- get_all_comments - Export all comments with context
- remove_comment - Delete existing comments
- set_function_comment - Add function-level documentation
Variable Management
- create_variable - Define local variables in functions
- get_variables - List function parameters and locals
- rename_variable - Rename variables for clarity
- set_variable_type - Update variable type information
Type System Management
- create_type - Define custom types and structures
- get_types - List all user-defined types
- create_enum - Create enumeration types
- create_typedef - Create type aliases
- get_type_info - Detailed type information
- get_classes - List classes and structures
- create_class - Define new classes/structures
- add_class_member - Add members to existing types
Data Analysis
- create_data_var - Define data variables at addresses
- get_data_vars - List all defined data variables
- get_data_at_address - Analyze raw data with type inference
Navigation & Context
- get_current_address - Get current cursor position
- get_current_function - Identify function at current address
- get_namespaces - Namespace and symbol organization
Advanced Analysis
- get_triage_summary - Complete binary overview
- get_function_statistics - Statistical analysis of all functions
Each tool is designed for seamless integration with AI workflows, providing structured responses that LLMs can easily interpret and act upon.
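As a sketch of what "structured responses" means in practice, the snippet below parses a hypothetical decompilation result. The field names (`function`, `address`, `pseudo_c`) are illustrative only; the actual shape of each tool's response is defined by the server.

```python
import json

# Hypothetical tool response for illustration; real field names may differ.
raw = """
{
  "function": "main",
  "address": "0x1149",
  "pseudo_c": "int main(int argc, char **argv) { ... }"
}
"""

result = json.loads(raw)
print(f"{result['function']} @ {result['address']}")
```

Because results arrive as structured JSON rather than free text, an LLM (or any client) can pick out individual fields instead of scraping output.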
Installation
Prerequisites
- Binary Ninja: Version 4000 or higher
- Python: 3.8+ (typically bundled with Binary Ninja)
- Platform: Windows, macOS, or Linux
Option 1: Binary Ninja Plugin Manager (Recommended)
- Open Binary Ninja
- Navigate to Tools → Manage Plugins
- Search for "BinAssistMCP"
- Click Install
- Restart Binary Ninja
Option 2: Manual Installation
Step 1: Download and Extract
git clone https://github.com/jtang613/BinAssistMCP.git
cd BinAssistMCP
Step 2: Install Dependencies
# Install Python dependencies
pip install -r requirements.txt
# Or install individually:
pip install anyio>=4.0.0 hypercorn>=0.16.0 mcp>=1.0.0 trio>=0.27.0 pydantic>=2.0.0 pydantic-settings>=2.0.0 click>=8.0.0
Step 3: Copy to Plugin Directory
Windows:
copy BinAssistMCP "%APPDATA%\Binary Ninja\plugins\"
macOS:
cp -r BinAssistMCP ~/Library/Application\ Support/Binary\ Ninja/plugins/
Linux:
cp -r BinAssistMCP ~/.binaryninja/plugins/
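The three destination paths above can be resolved programmatically. This helper mirrors those per-platform defaults (it is a convenience sketch, not part of the plugin):

```python
import os
import sys
from pathlib import Path

def plugin_dir() -> Path:
    """Default Binary Ninja user plugin directory, matching the copy
    commands above for Windows, macOS, and Linux."""
    if sys.platform == "win32":
        return Path(os.environ["APPDATA"]) / "Binary Ninja" / "plugins"
    if sys.platform == "darwin":
        return Path.home() / "Library" / "Application Support" / "Binary Ninja" / "plugins"
    return Path.home() / ".binaryninja" / "plugins"

print(plugin_dir())
```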
Step 4: Verify Installation
- Restart Binary Ninja
- Open any binary file
- Check Tools menu for "BinAssistMCP" submenu
- Look for startup messages in the log panel
Configuration
Basic Setup
- Open Binary Ninja Settings (Edit → Preferences)
- Navigate to the binassistmcp section
- Configure server settings:
  - Host: localhost (default)
  - Port: 9090 (default)
  - Transport: both (SSE + STDIO)
Advanced Configuration
# Environment variables (optional)
export BINASSISTMCP_SERVER__HOST=localhost
export BINASSISTMCP_SERVER__PORT=9090
export BINASSISTMCP_SERVER__TRANSPORT=both
export BINASSISTMCP_BINARY__MAX_BINARIES=10
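The double underscore in these variable names separates a settings section from a key (the project uses pydantic-settings for the actual parsing). The stdlib sketch below illustrates that naming convention; it is not the plugin's own loader:

```python
import os

def load_config(environ=None, prefix: str = "BINASSISTMCP_") -> dict:
    """Group PREFIX + SECTION__KEY variables into {"section": {"key": value}},
    mirroring the double-underscore nesting used by the exports above."""
    if environ is None:
        environ = os.environ
    config: dict = {}
    for name, value in environ.items():
        if not name.startswith(prefix) or "__" not in name:
            continue
        section, key = name[len(prefix):].lower().split("__", 1)
        config.setdefault(section, {})[key] = value
    return config

env = {
    "BINASSISTMCP_SERVER__PORT": "9090",
    "BINASSISTMCP_BINARY__MAX_BINARIES": "10",
}
print(load_config(env))
```

So `BINASSISTMCP_SERVER__PORT=9090` ends up as the `port` key of the `server` section.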
Usage
Starting the Server
Via Binary Ninja Menu:
- Tools → BinAssistMCP → Start Server
- Check log panel for startup confirmation
- Note the server URL (e.g., http://localhost:9090)
Auto-Startup (Default):
- Server starts automatically when Binary Ninja loads a file
- Configurable via the binassistmcp.plugin.auto_startup setting
Connecting with Claude Desktop
Add to your Claude Desktop MCP configuration:
{
"mcpServers": {
"binassist": {
"command": "python",
"args": ["/path/to/BinAssistMCP"],
"env": {
"BINASSISTMCP_SERVER__TRANSPORT": "stdio"
}
}
}
}
Using with SSE Transport
Connect web-based MCP clients to:
http://localhost:9090/sse
Integration Examples
Basic Function Analysis
Ask Claude: "Analyze the main function in the loaded binary and explain what it does"
Claude will use tools like:
- get_functions() to find main
- decompile_function() to get readable code
- get_function_pseudo_c() for C representation
- analyze_function() for comprehensive analysis
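Under the hood, each of those tool invocations is a JSON-RPC 2.0 `tools/call` request, as defined by the MCP specification. The snippet below builds one such request; the argument name (`function_name`) is illustrative, since the server's tool schema defines the actual parameters:

```python
import json

# An MCP tool call is a JSON-RPC 2.0 request with method "tools/call".
# The "arguments" keys shown here are hypothetical.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "decompile_function",
        "arguments": {"function_name": "main"},
    },
}
print(json.dumps(request, indent=2))
```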
Vulnerability Research
Ask Claude: "Find all functions that handle user input and check for buffer overflows"
Claude will use:
- search_functions_advanced() to find input handlers
- get_cross_references() to trace data flow
- get_variables() to analyze buffer usage
- set_comment() to document findings
Troubleshooting
Common Issues
Server won't start:
- Check Binary Ninja log panel for error messages
- Verify all dependencies are installed
- Ensure port 9090 is not in use
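A quick way to check the last point is to attempt a TCP connection to the port. This small helper (a convenience sketch, not part of the plugin) reports whether anything is already listening:

```python
import socket

def port_in_use(host: str = "localhost", port: int = 9090) -> bool:
    """Return True if something already accepts connections on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(1.0)
        # connect_ex returns 0 on success, an errno otherwise.
        return s.connect_ex((host, port)) == 0

if port_in_use():
    print("Port 9090 is taken; pick another port in the settings.")
```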
Binary Ninja crashes:
- Check Python environment compatibility
- Try reducing the max_binaries setting
- Restart with a single binary file
Tools return errors:
- Ensure binary analysis is complete
- Check if Binary Ninja file is still open
- Verify function/address exists
Support
- Issues: Report bugs on GitHub Issues
- Binary Ninja: Check official Binary Ninja documentation
Contributing
- Fork the repository
- Create a feature branch
- Make your changes following the existing code style
- Test with multiple binary types
- Submit a pull request
License
This project is licensed under the MIT License - see the LICENSE file for details.