6 changes: 6 additions & 0 deletions CHANGELOG.md
@@ -5,6 +5,12 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [0.21.0] - 2025-10-10
### Added
- Support for list and update methods in data export
### Updated
- Data export methods to use the enhanced endpoint and parameters

## [0.20.2] - 2025-10-06
### Updated
- Data table rows bulk replace larger rows handling.
32 changes: 32 additions & 0 deletions CLI.md
@@ -657,11 +657,30 @@
```bash
secops export log-types --time-window 24
secops export log-types --page-size 50
```

List recent data exports:

```bash
# List all recent exports
secops export list

# List with pagination
secops export list --page-size 10
```

Create a data export:

```bash
# Export a single log type (legacy method)
secops export create --gcs-bucket "projects/my-project/buckets/my-bucket" --log-type "WINDOWS" --time-window 24

# Export multiple log types
secops export create --gcs-bucket "projects/my-project/buckets/my-bucket" --log-types "WINDOWS,LINUX,GCP_DNS" --time-window 24

# Export all log types
secops export create --gcs-bucket "projects/my-project/buckets/my-bucket" --all-logs --time-window 24

# Export with explicit start and end times
secops export create --gcs-bucket "projects/my-project/buckets/my-bucket" --all-logs --start-time "2025-01-01T00:00:00Z" --end-time "2025-01-02T00:00:00Z"
```

Check export status:
@@ -670,6 +689,19 @@
```bash
secops export status --id "export-123"
```

Update an export (only for exports in IN_QUEUE state):

```bash
# Update start time
secops export update --id "export-123" --start-time "2025-01-01T02:00:00Z"

# Update log types
secops export update --id "export-123" --log-types "WINDOWS,LINUX,AZURE"

# Update the GCS bucket
secops export update --id "export-123" --gcs-bucket "projects/my-project/buckets/my-new-bucket"
```
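
These flags map onto the same parameters as the SDK's `update_data_export` method. A rough Python equivalent of the commands above (a sketch, assuming a `chronicle` client configured as shown in README.md):

```python
# Sketch of SDK equivalents for the CLI update commands above;
# assumes a configured `chronicle` client (see README.md for setup).
from datetime import datetime, timezone

# Update start time
chronicle.update_data_export(
    data_export_id="export-123",
    start_time=datetime(2025, 1, 1, 2, 0, 0, tzinfo=timezone.utc),
)

# Update log types
chronicle.update_data_export(
    data_export_id="export-123",
    log_types=["WINDOWS", "LINUX", "AZURE"],
)

# Update the GCS bucket
chronicle.update_data_export(
    data_export_id="export-123",
    gcs_bucket="projects/my-project/buckets/my-new-bucket",
)
```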

Cancel an export:

```bash
secops export cancel --id "export-123"
```
46 changes: 39 additions & 7 deletions README.md
@@ -583,26 +583,56 @@
```python
for log_type in available_log_types["available_log_types"]:
print(f"{log_type.display_name} ({log_type.log_type.split('/')[-1]})")
print(f" Available from {log_type.start_time} to {log_type.end_time}")

# Create a data export for a single log type (legacy method)
export = chronicle.create_data_export(
gcs_bucket="projects/my-project/buckets/my-export-bucket",
start_time=start_time,
end_time=end_time,
log_type="GCP_DNS" # Specify log type to export
log_type="GCP_DNS" # Single log type to export
)

# Create a data export for multiple log types
export_multiple = chronicle.create_data_export(
gcs_bucket="projects/my-project/buckets/my-export-bucket",
start_time=start_time,
end_time=end_time,
log_types=["WINDOWS", "LINUX", "GCP_DNS"] # Multiple log types to export
)

# Get the export ID
export_id = export["name"].split("/")[-1]
print(f"Created export with ID: {export_id}")
print(f"Status: {export['data_export_status']['stage']}")
print(f"Status: {export['data_export_status']['stage']}")

# List recent exports
recent_exports = chronicle.list_data_export(page_size=10)
print(f"Found {len(recent_exports.get('dataExports', []))} recent exports")

# Print details of recent exports
for item in recent_exports.get("dataExports", []):
item_id = item["name"].split("/")[-1]
if "dataExportStatus" in item:
status = item["dataExportStatus"]["stage"]
else:
status = item["data_export_status"]["stage"]
print(f"Export ID: {item_id}, Status: {status}")

# Check export status
status = chronicle.get_data_export(export_id)
print(f"Export status: {status['data_export_status']['stage']}")
print(f"Progress: {status['data_export_status'].get('progress_percentage', 0)}%")

# Update an export that is in IN_QUEUE state
if status.get("dataExportStatus", {}).get("stage") == "IN_QUEUE":
# Update with a new start time
updated_start = start_time + timedelta(hours=2)
update_result = chronicle.update_data_export(
data_export_id=export_id,
start_time=updated_start,
# Optionally update other parameters like end_time, gcs_bucket, or log_types
)
print("Export updated successfully")

# Cancel an export if needed
if status.get("dataExportStatus", {}).get("stage") in ["IN_QUEUE", "PROCESSING"]:
cancelled = chronicle.cancel_data_export(export_id)
print(f"Export has been cancelled. New status: {cancelled['data_export_status']['stage']}")

@@ -618,8 +648,10 @@
print(f"Created export for all logs. Status: {export_all['data_export_status']['stage']}")
```

The Data Export API supports:
- Exporting one, multiple, or all log types to Google Cloud Storage
- Listing recent exports and filtering results
- Checking export status and progress (see the polling sketch below)
- Updating exports that are in the queue
- Cancelling exports in progress
- Fetching available log types for a specific time range
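
Because an export can sit in `IN_QUEUE` or `PROCESSING` for a while, a small polling helper is often useful. Below is a minimal sketch (not SDK code) that reuses `get_data_export` and tolerates both the camelCase and snake_case status keys seen in the examples above:

```python
import time

def wait_for_export(chronicle, export_id, interval=30, max_checks=20):
    """Poll a data export until it leaves IN_QUEUE/PROCESSING or checks run out."""
    stage = None
    for _ in range(max_checks):
        status = chronicle.get_data_export(export_id)
        # Responses may use either key style, as the examples above show.
        details = status.get("dataExportStatus") or status.get("data_export_status", {})
        stage = details.get("stage")
        if stage not in ("IN_QUEUE", "PROCESSING"):
            break  # reached a terminal stage
        time.sleep(interval)
    return stage
```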

2 changes: 2 additions & 0 deletions api_module_mapping.md
@@ -115,6 +115,8 @@ Following shows mapping between SecOps [REST Resource](https://cloud.google.com/
|dataExports.create |v1alpha|chronicle.data_export.create_data_export |secops export create |
|dataExports.fetchavailablelogtypes |v1alpha|chronicle.data_export.fetch_available_log_types |secops export log-types |
|dataExports.get |v1alpha|chronicle.data_export.get_data_export |secops export status |
|dataExports.list |v1alpha|chronicle.data_export.list_data_export |secops export list |
|dataExports.patch |v1alpha|chronicle.data_export.update_data_export |secops export update |
|dataTableOperationErrors.get |v1alpha| | |
|dataTables.create |v1alpha|chronicle.data_table.create_data_table |secops data-table create |
|dataTables.dataTableRows.bulkCreate |v1alpha|chronicle.data_table.create_data_table_rows |secops data-table add-rows |
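
The two new rows correspond to `list_data_export` and `update_data_export` on the Chronicle client. A quick sketch of both (assuming a configured `chronicle` client, with response keys as shown in README.md):

```python
# Sketch of the two newly mapped methods; assumes a configured `chronicle` client.
page = chronicle.list_data_export(page_size=10)
for item in page.get("dataExports", []):
    print(item["name"].split("/")[-1])

# Patch an export that is still queued (only valid in IN_QUEUE state).
chronicle.update_data_export(
    data_export_id="export-123",
    log_types=["WINDOWS", "LINUX"],
)
```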
159 changes: 145 additions & 14 deletions examples/data_export_example.py
@@ -20,6 +20,7 @@
import os
import sys
from datetime import datetime, timedelta, timezone
from time import sleep
from secops import SecOpsClient
from secops.exceptions import APIError

@@ -30,18 +31,47 @@ def parse_args():
parser.add_argument("--project_id", required=True, help="GCP project ID")
parser.add_argument("--customer_id", required=True, help="Chronicle customer ID")
parser.add_argument("--region", default="us", help="Chronicle region (default: us)")
parser.add_argument("--bucket", required=True, help="GCS bucket name for export")
parser.add_argument("--bucket", help="GCS bucket name for export")
parser.add_argument(
"--days", type=int, default=1, help="Number of days to look back (default: 1)"
)
parser.add_argument("--log_type", help="Optional specific log type to export")
parser.add_argument("--log_type", help="Single log type to export (deprecated)")
parser.add_argument(
"--log_types",
help="Comma-separated list of log types to export (e.g., WINDOWS,LINUX)"
)
parser.add_argument("--all_logs", action="store_true", help="Export all log types")
parser.add_argument(
"--list_only",
action="store_true",
help="Only list available log types, don't create export",
)
parser.add_argument("--credentials", help="Path to service account JSON key file")

# Additional options for demonstrating list/update functionality
parser.add_argument(
"--list_exports",
action="store_true",
help="List recent data exports"
)
parser.add_argument(
"--list_count",
type=int,
default=5,
help="Number of exports to list when using --list_exports"
)
parser.add_argument(
"--update",
help="Update an existing export with the given ID (must be in IN_QUEUE state)"
)
parser.add_argument(
"--new_bucket",
help="New bucket name when updating an export"
)
parser.add_argument(
"--new_log_types",
help="New comma-separated list of log types when updating an export"
)

return parser.parse_args()

@@ -70,7 +100,79 @@ def main():
)

try:
# Check if we should just list exports
if args.list_exports:
print("\nListing recent data exports...")
list_result = chronicle.list_data_export(page_size=args.list_count)
exports = list_result.get("dataExports", [])
print(f"Found {len(exports)} exports")

for i, export_item in enumerate(exports, 1):
export_id = export_item["name"].split("/")[-1]
stage = export_item["dataExportStatus"]["stage"]
start = export_item.get("startTime", "N/A")
end = export_item.get("endTime", "N/A")

print(f"\n{i}. Export ID: {export_id}")
print(f" Status: {stage}")
print(f" Time range: {start} to {end}")
print(f" GCS Bucket: {export_item.get('gcsBucket', 'N/A')}")

# Get the log types
log_types = export_item.get("includeLogTypes", [])
if log_types:
log_type_names = [lt.split("/")[-1] for lt in log_types[:3]]
if len(log_types) <= 3:
print(f" Log types: {', '.join(log_type_names)}")
else:
print(f" Log types: {', '.join(log_type_names)} and {len(log_types) - 3} more")

if "nextPageToken" in list_result:
print(f"\nNext page token: {list_result['nextPageToken']}")

return 0

# Handle update command if specified
if args.update:
print(f"\nUpdating export ID: {args.update}")

# Get current status to verify it's in queue state
status = chronicle.get_data_export(args.update)
stage = status["dataExportStatus"]["stage"]

if stage != "IN_QUEUE":
print(f"Cannot update export: current status is {stage} but must be IN_QUEUE")
return 1

update_params = {"data_export_id": args.update}
updated = False

# Add GCS bucket if provided
if args.new_bucket:
new_gcs_bucket = f"projects/{args.project_id}/buckets/{args.new_bucket}"
update_params["gcs_bucket"] = new_gcs_bucket
print(f"Setting new GCS bucket: {new_gcs_bucket}")
updated = True

# Add log types if provided
if args.new_log_types:
new_log_types_list = [lt.strip() for lt in args.new_log_types.split(',')]
update_params["log_types"] = new_log_types_list
print(f"Setting new log types: {', '.join(new_log_types_list)}")
updated = True

if not updated:
print("No update parameters provided. Use --new_bucket or --new_log_types")
return 1

# Perform the update
result = chronicle.update_data_export(**update_params)
print("\nExport updated successfully!")
print(f"Status: {result['dataExportStatus']['stage']}")

return 0

# Fetch available log types for regular create flow
print("\nFetching available log types for export...")
result = chronicle.fetch_available_log_types(
start_time=start_time, end_time=end_time
@@ -94,14 +196,19 @@ def main():
return 0

# Validate export options
option_count = sum([bool(args.all_logs), bool(args.log_type), bool(args.log_types)])

if option_count > 1:
print("Error: Can only specify one of: --all_logs, --log_type, or --log_types")
return 1

if option_count == 0:
print("Error: Must specify one of: --all_logs, --log_type, or --log_types")
return 1

if not hasattr(args, "bucket") or not args.bucket:
print("Error: Must specify a GCS bucket name")
return 1
# Format GCS bucket path
gcs_bucket = f"projects/{args.project_id}/buckets/{args.bucket}"
print(f"\nExporting to GCS bucket: {gcs_bucket}")
Expand Down Expand Up @@ -130,6 +237,18 @@ def main():
end_time=end_time,
log_type=args.log_type,
)
elif args.log_types:
# Parse and validate comma-separated log types
log_types_list = [lt.strip() for lt in args.log_types.split(',')]
print(f"Creating data export for log types: {', '.join(log_types_list)}")

# Create export with multiple log types
export = chronicle.create_data_export(
gcs_bucket=gcs_bucket,
start_time=start_time,
end_time=end_time,
log_types=log_types_list,
)
else:
print("Creating data export for ALL log types")
export = chronicle.create_data_export(
@@ -143,15 +262,24 @@ def main():
export_id = export["name"].split("/")[-1]
print(f"\nExport created successfully!")
print(f"Export ID: {export_id}")
print(f"Status: {export['data_export_status']['stage']}")

if "dataExportStatus" in export:
print(f"Status: {export['dataExportStatus']['stage']}")
else:
print(f"Status: {export['data_export_status']['stage']}")

# Poll for status a few times to show progress
print("\nChecking export status:")

for i in range(3):
status = chronicle.get_data_export(export_id)
stage = status["data_export_status"]["stage"]
progress = status["data_export_status"].get("progress_percentage", 0)

if "dataExportStatus" in status:
stage = status["dataExportStatus"]["stage"]
progress = status["dataExportStatus"].get("progressPercentage", 0)
else:
stage = status["data_export_status"]["stage"]
progress = status["data_export_status"].get("progress_percentage", 0)

print(f" Status: {stage}, Progress: {progress}%")

@@ -160,12 +288,15 @@ def main():

if i < 2: # Don't wait after the last check
print(" Waiting 5 seconds...")

sleep(5)

print("\nExport job is running. You can check its status later with:")
print("\nExport job is running. You can check its status or manage it with:")
print(f" # Check Status:")
print(f" python export_status.py --export_id {export_id} ...")
print(f" # List all exports:")
print(f" python data_export_example.py --project_id={args.project_id} --customer_id={args.customer_id} --list_exports")
print(f" \n # Update the export if still in queue:")
print(f" python data_export_example.py --project_id={args.project_id} --customer_id={args.customer_id} --bucket={args.bucket} --update={export_id} --new_log_types=WINDOWS,LINUX")

return 0

2 changes: 1 addition & 1 deletion pyproject.toml
@@ -4,7 +4,7 @@ build-backend = "hatchling.build"

[project]
name = "secops"
version = "0.20.2"
version = "0.21.0"
description = "Python SDK for wrapping the Google SecOps API for common use cases"
readme = "README.md"
requires-python = ">=3.7"