gcp-developer
```bash
npx skills add https://github.com/serendipityoneinc/srp-claude-code-marketplace --skill gcp-developer
```
# GCP Developer Access (GCP 开发者访问)
为开发者提供 GCP 资源的访问权限,包括 BigQuery 数据查询和分析、GCS 对象查看、GKE 集群信息等。
Provides GCP resource access for developers, including BigQuery data queries and analysis, GCS object viewing, and GKE cluster information.
⚠️ **Important:** This skill provides READ-ONLY access. No management operations (create, update, delete) are allowed.
## Quick Start
### Query BigQuery (查询 BigQuery)
查询 BigQuery 表 `srpproduct-dc37e.dataset.table` 的前 10 行数据
Query the first 10 rows from BigQuery table `srpproduct-dc37e.dataset.table`
### List BigQuery Tables (列出 BigQuery 表)
显示 srpproduct-dc37e 项目下的所有表
List all tables in srpproduct-dc37e project
### Get Table Schema (获取表结构)
显示表 dataset.table 的字段结构
Show schema of table dataset.table
## Key Features
### 1. BigQuery Data Access (BigQuery 数据访问)
Execute read-only SQL queries against BigQuery:
**Available MCP Tools:**
- `mcp__bigquery__execute-query` – Execute SELECT queries
- `mcp__bigquery__list-tables` – List all tables
- `mcp__bigquery__describe-table` – Get table schema
**Query Capabilities:**
- Execute SELECT queries
- Join multiple tables
- Aggregate data (GROUP BY, COUNT, SUM, AVG)
- Filter and sort results
- Use subqueries and CTEs
- Restrictions: No INSERT, UPDATE, DELETE, DROP operations
**Query Parameters:**
- `query`: SQL SELECT statement (BigQuery SQL dialect)
- Results are returned as JSON
- Maximum query execution time: varies by project settings
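The read-only restriction can also be checked client-side before a query ever reaches the tool. A minimal sketch, assuming a guard sits in front of `mcp__bigquery__execute-query`; the server enforces the real policy, and this word-boundary check is only a fast-fail heuristic (it can false-positive on forbidden keywords inside string literals):

```python
import re

# Forbidden write/DDL keywords, matched as whole words (so a column named
# "created_at" does not trip the check, but a literal 'DROP' in a string would).
FORBIDDEN = re.compile(
    r"\b(INSERT|UPDATE|DELETE|DROP|CREATE|ALTER|TRUNCATE|MERGE)\b",
    re.IGNORECASE,
)

def is_read_only(query: str) -> bool:
    """Return True if the query looks like a read-only SELECT/WITH statement."""
    stripped = query.strip().rstrip(";")
    # Must start with SELECT or WITH (for CTEs)...
    if not re.match(r"(?is)^(SELECT|WITH)\b", stripped):
        return False
    # ...and contain no write/DDL keywords anywhere (catches "SELECT 1; DROP ...").
    return FORBIDDEN.search(stripped) is None
```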
### 2. Table Management (表管理 – 只读)
View table information and schema:
**List Tables:**
- Get all tables across datasets
- View table names and dataset hierarchy
- No pagination required
**Describe Table:**
- Get complete schema information
- View column names and data types
- See table metadata
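The schema information can be rendered into the markdown tables used in the examples below. A sketch assuming a simplified field shape (a list of `name`/`type`/`mode`/`description` dicts), not the exact `mcp__bigquery__describe-table` payload:

```python
# Turn a simplified describe-table result into a markdown schema table.
def schema_to_markdown(fields):
    lines = [
        "| Column Name | Data Type | Mode | Description |",
        "|-------------|-----------|------|-------------|",
    ]
    for f in fields:
        lines.append(
            f"| {f['name']} | {f['type']} | {f['mode']} | {f.get('description', '')} |"
        )
    return "\n".join(lines)

table = schema_to_markdown([
    {"name": "event_id", "type": "STRING", "mode": "REQUIRED",
     "description": "Unique event identifier"},
    {"name": "properties", "type": "JSON", "mode": "NULLABLE",
     "description": "Event properties"},
])
print(table)
```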
## Common Workflows
### Workflow 1: Explore Database
1. User: "显示所有可用的 BigQuery 表" ("Show all available BigQuery tables")
2. List tables: mcp__bigquery__list-tables
3. Display tables grouped by dataset
4. User selects a table to explore
5. Get schema: mcp__bigquery__describe-table
- table_name: "dataset.table"
6. Display column information
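Step 3's grouping can be sketched as follows, assuming `mcp__bigquery__list-tables` returns flat `dataset.table` names (the actual payload shape may differ):

```python
from collections import defaultdict

# Group a flat list of "dataset.table" names by their dataset prefix,
# matching the grouped display shown in Example 1.
def group_by_dataset(table_names):
    grouped = defaultdict(list)
    for name in table_names:
        dataset, _, table = name.partition(".")
        grouped[dataset].append(table)
    return dict(grouped)

groups = group_by_dataset([
    "analytics.user_events", "analytics.page_views",
    "product.users", "logs.error_logs",
])
# groups == {"analytics": ["user_events", "page_views"],
#            "product": ["users"], "logs": ["error_logs"]}
```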
### Workflow 2: Query Data
1. User: "查询用户活跃数据" ("Query user activity data")
2. Understand requirements
3. Execute query: mcp__bigquery__execute-query
- query: "SELECT user_id, COUNT(*) as events FROM dataset.events WHERE date >= '2026-01-01' GROUP BY user_id ORDER BY events DESC LIMIT 10"
4. Display results in table format
5. Provide insights or summary
### Workflow 3: Data Analysis
1. User: "分析最近一周的销售数据" ("Analyze the last week's sales data")
2. Query relevant tables
3. Aggregate and analyze data
4. Present findings with visualizations (text-based)
5. Suggest follow-up queries if needed
## Examples (示例)
### Example 1: List Available Tables (列出可用表)
**User (Chinese):**
显示 srpproduct-dc37e 项目下的所有 BigQuery 表
**Assistant Response:**
我会列出 srpproduct-dc37e 项目下的所有表。
<uses mcp__bigquery__list-tables>
找到以下数据集和表:
**dataset: analytics**
- analytics.user_events
- analytics.page_views
- analytics.conversions
**dataset: product**
- product.users
- product.orders
- product.inventory
**dataset: logs**
- logs.application_logs
- logs.error_logs
- logs.access_logs
共 9 个表。需要查看某个表的详细结构吗?
### Example 2: Get Table Schema (获取表结构)
**User (English):**
Show me the schema of analytics.user_events table
**Assistant Response:**
I'll get the schema for the analytics.user_events table.
<uses mcp__bigquery__describe-table>
- table_name: "analytics.user_events"
**Table: analytics.user_events**
| Column Name | Data Type | Mode | Description |
|-------------|-----------|------|-------------|
| event_id | STRING | REQUIRED | Unique event identifier |
| user_id | STRING | REQUIRED | User identifier |
| event_type | STRING | REQUIRED | Type of event |
| event_timestamp | TIMESTAMP | REQUIRED | When event occurred |
| properties | JSON | NULLABLE | Event properties |
| session_id | STRING | NULLABLE | Session identifier |
| device_type | STRING | NULLABLE | Device category |
**Total:** 7 columns
Would you like to query some data from this table?
### Example 3: Query Data (查询数据)
**User (Chinese):**
查询昨天活跃用户数量最多的前 10 个事件类型
**Assistant Response:**
我会查询昨天各事件类型的活跃用户数。
<uses mcp__bigquery__execute-query>
- query:
```sql
SELECT
event_type,
COUNT(DISTINCT user_id) as active_users
FROM
analytics.user_events
WHERE
DATE(event_timestamp) = CURRENT_DATE() - 1
GROUP BY
event_type
ORDER BY
active_users DESC
LIMIT 10
```
查询结果 – 昨天最活跃的事件类型:
| 事件类型 | 活跃用户数 |
|---|---|
| page_view | 15,234 |
| button_click | 8,956 |
| form_submit | 4,123 |
| video_play | 3,890 |
| search | 3,456 |
| download | 2,345 |
| share | 1,987 |
| comment | 1,654 |
| like | 1,432 |
| bookmark | 1,098 |
总计: 44,175 活跃用户
主要发现:
- 页面浏览是最常见的事件,占总活跃用户的 34.5%
- 交互类事件(点击、提交)也有较高的参与度
- 社交类事件(分享、评论)参与度相对较低
需要我进一步分析某个特定事件类型吗?
### Example 4: Complex Analytics Query (复杂分析查询)
**User (English):**
Analyze user retention: how many users from last week are still active this week?
**Assistant Response:**
I'll analyze user retention by comparing last week and this week's active users.
<uses mcp__bigquery__execute-query>
- query:
```sql
WITH last_week_users AS (
SELECT DISTINCT user_id
FROM analytics.user_events
WHERE event_timestamp BETWEEN TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 14 DAY)
AND TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
),
this_week_users AS (
SELECT DISTINCT user_id
FROM analytics.user_events
WHERE event_timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
)
SELECT
COUNT(DISTINCT lw.user_id) as last_week_users,
COUNT(DISTINCT tw.user_id) as this_week_users,
COUNT(DISTINCT CASE WHEN tw.user_id IS NOT NULL THEN lw.user_id END) as retained_users,
ROUND(COUNT(DISTINCT CASE WHEN tw.user_id IS NOT NULL THEN lw.user_id END) * 100.0 / COUNT(DISTINCT lw.user_id), 2) as retention_rate
FROM last_week_users lw
LEFT JOIN this_week_users tw ON lw.user_id = tw.user_id
```
**User Retention Analysis:**
| Metric | Value |
|---|---|
| Last Week Active Users | 45,678 |
| This Week Active Users | 48,234 |
| Retained Users | 32,456 |
| Retention Rate | 71.05% |
**Insights:**
- Strong retention rate of 71.05%
- 13,222 users from last week did not return (28.95% churn)
- 15,778 new users joined this week
- Net growth: +2,556 users week-over-week (+5.6%)
**Recommendations:**
- Focus on re-engaging the 13K churned users
- Analyze what features retained users engaged with
- Continue current growth strategies
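The headline metrics above follow from just three inputs (last week's users, this week's users, and the overlap). A quick recomputation makes the arithmetic explicit, using the example's illustrative figures (note that 32,456 / 45,678 rounds to 71.05%):

```python
# Derived retention metrics from Example 4's illustrative inputs.
last_week, this_week, retained = 45_678, 48_234, 32_456

retention_rate = round(retained * 100.0 / last_week, 2)  # share of last week's users seen again
churned = last_week - retained                           # last-week users who did not return
new_users = this_week - retained                         # this-week users not seen last week
net_growth = this_week - last_week                       # week-over-week change

print(retention_rate, churned, new_users, net_growth)
```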
## Important Notes
### Read-Only Restrictions (只读限制)
**Allowed Operations:**
✅ SELECT queries
✅ View table schemas
✅ List tables and datasets
✅ Aggregate and analyze data
✅ Join tables
✅ Use subqueries and CTEs
**Prohibited Operations:**
❌ INSERT, UPDATE, DELETE
❌ DROP or CREATE tables
❌ ALTER table schemas
❌ TRUNCATE tables
❌ MERGE statements
❌ Any other DML or DDL statements
### Query Best Practices (查询最佳实践)
1. **Use LIMIT**: Always limit results for exploratory queries
2. **Filter Early**: Use WHERE clauses to reduce data scanned
3. **Avoid SELECT ***: Specify only needed columns
4. **Use Partitions**: Filter by partition columns (usually dates)
5. **Check Costs**: Be mindful of data processed
6. **Use CTEs**: Make complex queries more readable
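Tips 1–3 can be partially automated with a rough pre-flight check. This is plain string matching, not a SQL parser, so treat its output as hints rather than verdicts:

```python
# Heuristic lint for exploratory queries: flags SELECT *, a missing LIMIT,
# and a missing WHERE filter. Simple uppercase substring checks only.
def lint_query(query: str) -> list:
    warnings = []
    q = query.upper()
    if "SELECT *" in q:
        warnings.append("Avoid SELECT *: list only the columns you need")
    if "LIMIT" not in q:
        warnings.append("Add a LIMIT for exploratory queries")
    if "WHERE" not in q:
        warnings.append("Consider a WHERE filter to reduce bytes scanned")
    return warnings

lint_query("SELECT * FROM analytics.user_events")  # triggers all three warnings
```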
### BigQuery SQL Dialect (BigQuery SQL 方言)
BigQuery uses Standard SQL with some specific functions:
- Date functions: `CURRENT_DATE()`, `DATE_SUB()`, `TIMESTAMP_TRUNC()`
- String functions: `STRING_AGG()`, `SPLIT()`, `REGEXP_EXTRACT()`
- Array operations: `UNNEST()`, `ARRAY_AGG()`
- Window functions: `ROW_NUMBER()`, `LAG()`, `LEAD()`
### Project and Dataset Access (项目和数据集访问)
- Default project: `srpproduct-dc37e`
- Default location: `us-east1`
- Access depends on user's GCP IAM permissions
- Some datasets may be restricted
## Error Handling
Common errors and solutions:
1. **"Table not found" (è¡¨æªæ¾å°)**
- Check table name spelling
- Verify dataset exists
- Ensure proper format: `dataset.table`
2. **"Permission denied" (æé被æç»)**
- User lacks BigQuery read permissions
- Contact GCP admin to grant access
- Check IAM roles
3. **"Syntax error" (è¯æ³é误)**
- Review SQL syntax
- Use BigQuery Standard SQL
- Check for typos in column names
4. **"Query timeout" (æ¥è¯¢è¶
æ¶)**
- Query is too complex or scans too much data
- Add filters to reduce data scanned
- Break into smaller queries
5. **"Quota exceeded" (è¶
åºé
é¢)**
- Too many concurrent queries
- Wait and retry
- Contact admin to increase quota
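The last two errors (timeouts, quota) are transient, so a wait-and-retry wrapper helps. A generic sketch with exponential backoff and jitter; the `run_query` callable and the error-string matching are illustrative placeholders, not part of the skill's actual tooling:

```python
import random
import time

# Retry a query callable on transient errors, backing off exponentially.
def with_retries(run_query, attempts=4, base_delay=1.0):
    for attempt in range(attempts):
        try:
            return run_query()
        except RuntimeError as err:
            transient = any(s in str(err).lower() for s in ("quota", "timeout"))
            if not transient or attempt == attempts - 1:
                raise  # non-transient, or out of attempts: propagate
            # Backoff with jitter: ~1s, ~2s, ~4s for the default base_delay.
            time.sleep(base_delay * (2 ** attempt + random.random()))
```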
## Security & Compliance
### Data Access Policy (数据访问策略)
- Only read access granted
- No data modification allowed
- All queries are logged
- Follow company data policies
### Sensitive Data (敏感数据)
- Do not query PII without authorization
- Respect data classification levels
- Use aggregated data when possible
- Do not share raw sensitive data
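When a row-level identifier must leave BigQuery despite these rules, pseudonymizing it limits exposure. A minimal sketch using a salted SHA-256; the salt value and the 16-character truncation are illustrative choices, and a real salt must be kept secret and rotated per policy:

```python
import hashlib

# Illustrative salt only -- a real deployment must load this from a secret store.
SALT = b"replace-with-a-secret-salt"

def pseudonymize(user_id: str) -> str:
    """Deterministically map a user id to an opaque token (joinable, not reversible
    without the salt)."""
    return hashlib.sha256(SALT + user_id.encode()).hexdigest()[:16]
```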
### Best Practices (最佳实践)
- Use for development and analytics only
- Do not use production queries on large tables without testing
- Always add LIMIT for exploratory queries
- Be mindful of query costs
## Prerequisites
### Environment Variables
Set up GCP project configuration:
```bash
export GCP_PROJECT_ID="srpproduct-dc37e"
export GCP_LOCATION="us-east1"
```
### GCP Authentication
Ensure you have authenticated with GCP:
```bash
gcloud auth application-default login
```
Or use a service account key:
```bash
export GOOGLE_APPLICATION_CREDENTIALS="/path/to/service-account-key.json"
```
### Required Permissions
Minimum IAM roles required:
- `roles/bigquery.dataViewer` – View table data
- `roles/bigquery.jobUser` – Execute queries
- `roles/bigquery.metadataViewer` – View table metadata
## Tips for Effective Use
- **Start with schema**: Always check table schema before querying
- **Use LIMIT**: Test queries with LIMIT 10 first
- **Aggregate smartly**: Use GROUP BY for summarization
- **Join efficiently**: Filter before joining large tables
- **Save common queries**: Document frequently used queries
- **Use date partitions**: Always filter by date for partitioned tables
- **Check costs**: Monitor bytes processed in query results
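For the cost tip, bytes processed translate directly into on-demand spend. A back-of-envelope sketch assuming an illustrative $6.25/TiB on-demand rate; check current BigQuery pricing for your edition and region:

```python
# Illustrative on-demand rate (USD per TiB scanned) -- verify against current pricing.
PRICE_PER_TIB = 6.25

def estimate_cost(bytes_processed: int) -> float:
    """Rough on-demand cost estimate from the bytes-processed figure a query reports."""
    tib = bytes_processed / 2**40
    return round(tib * PRICE_PER_TIB, 4)

estimate_cost(512 * 2**30)  # 512 GiB = 0.5 TiB -> 3.125
```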
## Related Skills
- `github-integration`: GitHub code and PR management
- Future: `gcs-readonly`, `gke-readonly`, `dataflow-readonly`
## Limitations
### Current Limitations
- BigQuery only (GCS and GKE not yet implemented)
- No query history or saved queries
- No data export capabilities
- No query optimization suggestions
- No automatic error recovery
### Future Enhancements
- GCS bucket and object listing
- GCS file content preview
- GKE cluster and pod information (read-only)
- Cloud SQL read-only access
- Query performance analytics
- Saved query templates
- Data visualization helpers
## GCS and GKE (Coming Soon)
### GCS Read-Only (计划中)
- List buckets
- List objects in buckets
- View object metadata
- Download small files for viewing
- Search for objects
### GKE Read-Only (计划中)
- List clusters
- View cluster configuration
- List namespaces
- View pod status
- Read pod logs
- View service configurations
**Note:** These features require additional MCP server configurations and will be added in future versions.