
Commit 4459e54

docs: cleanup URLs from docs and code comments
1 parent 9498674 commit 4459e54

2 files changed: +2 −2 lines changed


model_garden/anthropic/anthropic_batchpredict_with_bq.py

Lines changed: 1 addition & 1 deletion

@@ -26,7 +26,7 @@ def generate_content(output_uri: str) -> str:
     # output_uri = f"bq://your-project.your_dataset.your_table"

     job = client.batches.create(
-        # Check Anthropic Claude region availability in https://cloud.devsite.corp.google.com/vertex-ai/generative-ai/docs/partner-models/use-claude#regions
+        # Check Anthropic Claude region availability in https://cloud.google.com/vertex-ai/generative-ai/docs/partner-models/use-claude#regions
         # More about Anthropic model: https://console.cloud.google.com/vertex-ai/publishers/anthropic/model-garden/claude-3-5-haiku
         model="publishers/anthropic/models/claude-3-5-haiku",
         # The source dataset needs to be created specifically in us-east5

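For context, a minimal sketch of how the surrounding sample in anthropic_batchpredict_with_bq.py might be structured, assuming the google-genai Python SDK; the client construction, project ID, source table, and CreateBatchJobConfig usage below are illustrative assumptions and are not part of this commit.

    # Sketch only: assumes the google-genai SDK. Everything except the model ID
    # and the two comments from the diff above is a hypothetical placeholder.
    from google import genai
    from google.genai import types

    def generate_content(output_uri: str) -> str:
        # output_uri = f"bq://your-project.your_dataset.your_table"

        # The source dataset needs to be created specifically in us-east5.
        client = genai.Client(vertexai=True, project="your-project", location="us-east5")

        job = client.batches.create(
            # Check Anthropic Claude region availability in https://cloud.google.com/vertex-ai/generative-ai/docs/partner-models/use-claude#regions
            # More about Anthropic model: https://console.cloud.google.com/vertex-ai/publishers/anthropic/model-garden/claude-3-5-haiku
            model="publishers/anthropic/models/claude-3-5-haiku",
            src="bq://your-project.your_dataset.your_source_table",  # hypothetical BigQuery source
            config=types.CreateBatchJobConfig(dest=output_uri),  # write results to the caller's table
        )
        return job.name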
pubsublite/spark-connector/README.md

Lines changed: 1 addition & 1 deletion

@@ -193,7 +193,7 @@ Here is an example output: <!--TODO: update attributes field output with the nex
 [Install Python and virtualenv]: https://cloud.google.com/python/setup/
 [Cloud Console for Dataproc]: https://console.cloud.google.com/dataproc/

-[Create Cluster]: https://pantheon.corp.google.com/dataproc/clustersAdd
+[Create Cluster]: https://console.cloud.google.com/dataproc/clustersAdd
 [Dataproc Image Version 1.5]: https://cloud.google.com/dataproc/docs/concepts/versioning/dataproc-release-1.5
 [Dataproc Image Version 2.0]: https://cloud.google.com/dataproc/docs/concepts/versioning/dataproc-release-2.0
 [compatibility]: gs://spark-lib/pubsublite/pubsublite-spark-sql-streaming-LATEST-with-dependencies.jar
