# notebooklm-mcp-pro
Production-grade Model Context Protocol server for Google NotebookLM that lets any MCP-capable client (Claude, ChatGPT, Cursor, etc.) interact with NotebookLM notebooks, sources, chats, and artifacts.
CI · PyPI version · Python 3.11+ · License: MIT · Coverage · Code style: ruff
Connect any MCP-capable client to Google NotebookLM. Works with Claude Desktop, Claude.ai, ChatGPT, Cursor, VS Code Continue, and any client that speaks MCP or OpenAPI.
OpenAPI-style clients discover the server through `/.well-known/ai-plugin.json`, and deep-research connectors use the `search` and `fetch` tools. Google NotebookLM is useful for research notebooks, source-grounded chat, study material, and artifact generation.
MCP clients need a stable programmatic bridge, and notebooklm-mcp-pro provides it. The server exposes:

- NotebookLM actions as MCP tools
- NotebookLM records as MCP resources
- workflow starters as MCP prompts
- an OpenAPI action surface for clients that integrate through HTTP schemas
## Installation

With uv:

```bash
uv tool install notebooklm-mcp-pro
nlm-mcp --version
```

With pip:

```bash
python -m pip install --upgrade notebooklm-mcp-pro
nlm-mcp --version
```

With pipx:

```bash
pipx install notebooklm-mcp-pro
nlm-mcp --version
```

With all optional extras:

```bash
python -m pip install "notebooklm-mcp-pro[all]"
```

From source:

```bash
git clone https://github.com/oaslananka/notebooklm-mcp-pro
cd notebooklm-mcp-pro
make bootstrap
make test
```
## NotebookLM authentication

Run the NotebookLM browser login once:

```bash
notebooklm-py login
```

The default auth file is `~/.config/nlm-mcp/notebooklm_auth.json`. Override it with:

```bash
export NLM_MCP_NOTEBOOKLM_AUTH_FILE=/secure/path/notebooklm_auth.json
```

For containers, supply the auth state inline:

```bash
export NLM_MCP_NOTEBOOKLM_AUTH_JSON='{"cookies":[],"origins":[]}'
```

Treat this JSON as a secret.
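One way to populate the inline variable for a container is to read the auth file written by `notebooklm-py login` and re-emit it as compact JSON. A minimal sketch; the file path and the `{"cookies": [], "origins": []}` shape follow the defaults documented above, and the demo file stands in for a real login state:

```python
import json
import os
import tempfile
from pathlib import Path

def inline_auth_json(path: Path) -> str:
    """Read a NotebookLM auth file and return compact JSON suitable
    for NLM_MCP_NOTEBOOKLM_AUTH_JSON."""
    state = json.loads(path.read_text())  # validate it parses before exporting
    return json.dumps(state, separators=(",", ":"))

# Demo with a placeholder auth state in the documented shape.
# In practice, point this at ~/.config/nlm-mcp/notebooklm_auth.json.
with tempfile.TemporaryDirectory() as tmp:
    demo = Path(tmp) / "notebooklm_auth.json"
    demo.write_text('{"cookies": [], "origins": []}')
    os.environ["NLM_MCP_NOTEBOOKLM_AUTH_JSON"] = inline_auth_json(demo)
```

This keeps the secret out of image layers: the JSON is injected at container start, not baked into the build.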
## Quick start (stdio)

```bash
pip install notebooklm-mcp-pro
notebooklm-py login
nlm-mcp stdio
```

Use this mode for local desktop clients. It does not add an HTTP auth layer; the caller process controls access.
## HTTP with bearer token

```bash
export NLM_MCP_TRANSPORT=http
export NLM_MCP_AUTH_MODE=token
export NLM_MCP_BEARER_TOKEN="$(python -c 'import secrets; print(secrets.token_urlsafe(32))')"
export NLM_MCP_BASE_URL=https://your-server.example.com
nlm-mcp serve --host 0.0.0.0 --port 8080
```

Test:

```bash
curl https://your-server.example.com/healthz
curl -H "Authorization: Bearer $NLM_MCP_BEARER_TOKEN" \
  https://your-server.example.com/mcp
```
## HTTP with GitHub OAuth

```bash
export NLM_MCP_TRANSPORT=http
export NLM_MCP_AUTH_MODE=github-oauth
export NLM_MCP_BASE_URL=https://your-server.example.com
export NLM_MCP_GITHUB_CLIENT_ID=your-client-id
export NLM_MCP_GITHUB_CLIENT_SECRET=your-client-secret
export NLM_MCP_OAUTH_ALLOWED_USERS=oaslananka
nlm-mcp serve --host 0.0.0.0 --port 8080
```

Users start the login flow at `https://your-server.example.com/auth/login`.
## Claude Desktop

Add this to the desktop config file:

- macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
- Windows: `%APPDATA%\Claude\claude_desktop_config.json`
- Linux: `~/.config/Claude/claude_desktop_config.json`

```json
{
  "mcpServers": {
    "notebooklm": {
      "command": "nlm-mcp",
      "args": ["stdio"],
      "env": {
        "NLM_MCP_LOG_LEVEL": "WARNING"
      }
    }
  }
}
```
With uvx:

```json
{
  "mcpServers": {
    "notebooklm": {
      "command": "uvx",
      "args": ["notebooklm-mcp-pro", "stdio"]
    }
  }
}
```
## Claude.ai (remote)

Deploy the HTTP server behind a public HTTPS URL and connect to `https://your-server.example.com/mcp`. Choose bearer token or OAuth based on the server configuration, then run `admin.health` to verify the connection.
## ChatGPT

Deploy the HTTP server and import the schema from `https://your-server.example.com/openapi.json`. Set authentication to bearer token when `NLM_MCP_AUTH_MODE=token`. Tool actions are exposed as `POST /tools/{tool_name}`, and the plugin manifest is served at `GET /.well-known/ai-plugin.json`.
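As a sketch of what an action call looks like, the snippet below builds an authenticated `POST /tools/{tool_name}` request for `notebook.list`. The base URL and token are the placeholders from the examples above, and the empty-arguments body is an assumption; the authoritative request schema comes from `openapi.json`:

```python
import json
import urllib.request

BASE_URL = "https://your-server.example.com"  # placeholder from the examples above
TOKEN = "replace-with-generated-token"        # value of NLM_MCP_BEARER_TOKEN

def build_tool_request(tool_name: str, arguments: dict) -> urllib.request.Request:
    """Build a POST /tools/{tool_name} request with bearer auth.
    The JSON body shape is an assumption; consult openapi.json."""
    return urllib.request.Request(
        f"{BASE_URL}/tools/{tool_name}",
        data=json.dumps(arguments).encode(),
        headers={
            "Authorization": f"Bearer {TOKEN}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_tool_request("notebook.list", {})
# urllib.request.urlopen(req) would send it; omitted here.
```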
## Cursor

Use the same local stdio config shape:

```json
{
  "mcpServers": {
    "notebooklm": {
      "command": "nlm-mcp",
      "args": ["stdio"]
    }
  }
}
```
## VS Code Continue

Use local stdio (`nlm-mcp stdio`) or the remote HTTP endpoint (`https://your-server.example.com/mcp`), depending on your Continue configuration.
## Tools

### Notebook tools

| Tool | Purpose | Safety |
|---|---|---|
| `notebook.list` | List notebooks | read-only |
| `notebook.create` | Create a notebook | mutating |
| `notebook.get` | Get notebook metadata | read-only |
| `notebook.rename` | Rename a notebook | idempotent |
| `notebook.delete` | Delete a notebook | destructive, confirmation required |
| `notebook.share_public` | Toggle public sharing | destructive, confirmation required when enabling |
| `notebook.share_invite` | Invite collaborator | mutating, confirmation required |
| `notebook.share_status` | Read sharing settings | read-only |
### Source tools

| Tool | Purpose | Safety |
|---|---|---|
| `source.add_url` | Add a web URL | mutating |
| `source.add_youtube` | Add a YouTube video | mutating |
| `source.add_file` | Upload a local file | mutating |
| `source.add_gdrive` | Add a Google Drive document | mutating |
| `source.add_text` | Add pasted text | mutating |
| `source.list` | List sources | read-only |
| `source.get` | Get source metadata | read-only |
| `source.get_fulltext` | Get indexed text | read-only |
| `source.refresh` | Re-index a source | idempotent |
| `source.wait` | Wait for indexing | read-only, blocking |
| `source.remove` | Remove a source | destructive, confirmation required |
### Chat tools

| Tool | Purpose |
|---|---|
| `chat.ask` | Ask a one-shot question |
| `chat.query` | OpenAPI alias for asking |
| `chat.stream_query` | Stream-oriented alias returning a completed result |
| `chat.conversation_start` | Start or identify a conversation |
| `chat.continue` | Continue a conversation |
| `chat.history` | Read conversation history |
| `chat.save_to_notes` | Save content as a note |
| `chat.save_note` | Alias for note save |
| `chat.list_notes` | List notes |
### Research tools

| Tool | Purpose |
|---|---|
| `research.web_start` | Start web research |
| `research.drive_start` | Start Drive research |
| `research.status` | Poll research status |
| `research.wait` | Wait for research and optionally import sources |
### Generation tools

| Tool | Output |
|---|---|
| `generate.audio_overview` | Audio overview |
| `generate.video_overview` | Video overview |
| `generate.cinematic_video` | Cinematic video |
| `generate.slide_deck` | Slide deck |
| `generate.infographic` | Infographic |
| `generate.quiz` | Quiz |
| `generate.flashcards` | Flashcards |
| `generate.report` | Report |
| `generate.data_table` | Data table |
| `generate.mind_map` | Mind map |
### Artifact tools

| Tool | Purpose |
|---|---|
| `artifact.list` | List artifacts and tracked tasks |
| `artifact.status` | Poll task status |
| `artifact.wait` | Wait for task completion |
| `artifact.download` | Download an artifact |
| `artifact.delete` | Delete an artifact when supported |
| `artifact.cancel` | Cancel a task when supported |
| `artifact.revise_slide` | Revise one slide |
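Generation tools produce tracked tasks, so a client typically polls `artifact.status` (or delegates to `artifact.wait`) until the task settles. The polling loop can be sketched as below; the `call_tool` callable, the `state` field, and its `running`/`done`/`failed` values are illustrative assumptions, not the server's documented payload:

```python
import time

def poll_until_done(call_tool, task_id: str,
                    interval: float = 2.0, timeout: float = 600.0) -> dict:
    """Poll artifact.status until the task leaves the running state.
    The status payload shape is an illustrative assumption."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = call_tool("artifact.status", {"task_id": task_id})
        if status.get("state") in {"done", "failed"}:
            return status
        time.sleep(interval)  # avoid hammering the server between polls
    raise TimeoutError(f"task {task_id} did not finish in {timeout}s")

# Stub transport that finishes on the second poll, for demonstration.
_states = iter([{"state": "running"}, {"state": "done"}])
result = poll_until_done(lambda name, args: next(_states), "task-1", interval=0.0)
```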
### Language tools

| Tool | Purpose |
|---|---|
| `language.list` | List supported languages |
| `language.get` | Read current output language |
| `language.set` | Set account-global output language |
### Search tools

| Tool | Purpose |
|---|---|
| `search` | Return matching record IDs |
| `fetch` | Return full record by ID |
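The two tools are meant to be chained: `search` narrows to IDs, `fetch` resolves one record. The sketch below shows that pattern with a stubbed transport; the `call_tool` helper and both result shapes are assumptions for illustration, not the server's actual schema:

```python
def call_tool(name: str, args: dict) -> dict:
    """Stand-in for an MCP tool call; a real client would go through
    its MCP session. Result shapes are illustrative assumptions."""
    if name == "search":
        return {"ids": ["notebook/abc123"]}
    if name == "fetch":
        return {"id": args["id"], "title": "Example notebook"}
    raise ValueError(name)

# search returns matching record IDs...
ids = call_tool("search", {"query": "quarterly report"})["ids"]
# ...and fetch returns the full record for one of them.
record = call_tool("fetch", {"id": ids[0]})
```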
### Admin tools

| Tool | Purpose |
|---|---|
| `admin.health` | Server health |
| `admin.version` | Package and runtime version |
## Configuration

| Variable | Default | Description |
|---|---|---|
| `NLM_MCP_TRANSPORT` | `stdio` | `stdio` or `http` |
| `NLM_MCP_HTTP_HOST` | `0.0.0.0` | HTTP bind host |
| `NLM_MCP_HTTP_PORT` | `8080` | HTTP bind port |
| `NLM_MCP_HTTP_PATH` | `/mcp` | MCP endpoint path |
| `NLM_MCP_BASE_URL` | unset | Public URL |
| `NLM_MCP_AUTH_MODE` | `none` | `none`, `token`, or `github-oauth` |
| `NLM_MCP_BEARER_TOKEN` | unset | Token auth secret |
| `NLM_MCP_GITHUB_CLIENT_ID` | unset | OAuth client ID |
| `NLM_MCP_GITHUB_CLIENT_SECRET` | unset | OAuth client secret |
| `NLM_MCP_OAUTH_ALLOWED_USERS` | unset | GitHub username allowlist |
| `NLM_MCP_NOTEBOOKLM_AUTH_FILE` | `~/.config/nlm-mcp/notebooklm_auth.json` | NotebookLM auth file |
| `NLM_MCP_NOTEBOOKLM_AUTH_JSON` | unset | Inline NotebookLM auth JSON |
| `NLM_MCP_DATA_DIR` | `~/.local/share/nlm-mcp` | Runtime data directory |
| `NLM_MCP_LOG_LEVEL` | `INFO` | Log level |
| `NLM_MCP_LOG_FORMAT` | `json` | `json` or `console` |

See Configuration for the full table.
## Docker

```bash
docker build -f deploy/Dockerfile -t notebooklm-mcp-pro:dev .

docker run --rm -p 8080:8080 \
  -e NLM_MCP_TRANSPORT=http \
  -e NLM_MCP_AUTH_MODE=token \
  -e NLM_MCP_BEARER_TOKEN=replace-with-generated-token \
  -e NLM_MCP_BASE_URL=http://localhost:8080 \
  notebooklm-mcp-pro:dev
```

With compose:

```bash
docker compose -f deploy/docker-compose.yml up --build
```

Prebuilt image:

```bash
docker pull ghcr.io/oaslananka/notebooklm-mcp-pro:latest
```
## HTTP endpoints

| Endpoint | Purpose | Auth |
|---|---|---|
| `GET /healthz` | Health check | exempt |
| `GET /openapi.json` | OpenAPI schema | exempt |
| `GET /.well-known/ai-plugin.json` | Plugin manifest | exempt |
| `GET /.well-known/oauth-protected-resource` | OAuth resource metadata | exempt |
| `GET /.well-known/oauth-authorization-server` | OAuth server metadata | exempt |
| `GET /auth/login` | GitHub OAuth login | exempt |
| `GET /auth/callback` | GitHub OAuth callback | exempt |
| `POST /tools/{tool_name}` | OpenAPI tool action | authenticated |
| `/mcp` | Streamable HTTP MCP endpoint | authenticated |
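The Auth column above boils down to a fixed exempt set. The predicate below is an illustrative restatement of the table for client authors, not the server's actual middleware code:

```python
# Paths served without authentication, per the endpoint table above.
EXEMPT_PATHS = {
    "/healthz",
    "/openapi.json",
    "/.well-known/ai-plugin.json",
    "/.well-known/oauth-protected-resource",
    "/.well-known/oauth-authorization-server",
    "/auth/login",
    "/auth/callback",
}

def requires_auth(path: str) -> bool:
    """True when a request to this path must carry credentials."""
    return path not in EXEMPT_PATHS
```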
## Architecture

```mermaid
flowchart TB
    Desktop["Desktop MCP client"] --> Stdio["stdio transport"]
    Remote["Remote MCP/OpenAPI client"] --> HTTP["Streamable HTTP"]
    HTTP --> Auth["Auth middleware"]
    Stdio --> Server["FastMCP server"]
    Auth --> Server
    Server --> NotebookTools["Notebook tools"]
    Server --> SourceTools["Source tools"]
    Server --> ArtifactTools["Artifact tools"]
    Server --> Resources["MCP resources"]
    Server --> Prompts["MCP prompts"]
    NotebookTools --> Backend["NotebookLMBackend"]
    SourceTools --> Backend
    ArtifactTools --> Backend
    Backend --> NLM["notebooklm-py"]
    NLM --> Google["Google NotebookLM"]
    Server --> Store["SQLite task and OAuth store"]
```
Serve `NLM_MCP_BASE_URL` over HTTPS for OAuth. See Security.
## Development

```bash
make bootstrap
make lint
make typecheck
make test
make docs
```

Generate the catalog:

```bash
make catalog
```

Run the HTTP server:

```bash
make run-http
```

Run the stdio server:

```bash
make run-stdio
```
## Releases

Releases are cut from tags:

```bash
git tag v1.0.0
git push origin v1.0.0
```

The release workflow validates the tag, builds distributions, generates an SBOM, publishes to PyPI, pushes GHCR images, and creates a GitHub release.
## Roadmap

For planned follow-up work, see docs/ROADMAP.md.
## Contributing

Contributions are welcome when they are scoped, tested, and documented. Read CONTRIBUTING.md.

Before opening a PR:

```bash
make lint
make typecheck
make test
make docs
```

Use Conventional Commits.
## License

MIT License. See LICENSE.