Commit 33e7d79

Fix: Correct typos in README (#295)
Hey, our tool caught a few typos in your repository. Also, a few typos on your site like 'scaleability'. Here is your error report: https://triplechecker.com/s/KhnZEJ/unstructured.io Hope it's helpful!
1 parent ab84dcb commit 33e7d79

File tree

1 file changed: +2 -2 lines


README.md (mode changed 100755 → 100644)

Lines changed: 2 additions & 2 deletions
@@ -316,7 +316,7 @@ with UnstructuredClient() as uc_client:

 </br>

-The same SDK client can also be used to make asychronous requests by importing asyncio.
+The same SDK client can also be used to make asynchronous requests by importing asyncio.
 ```python
 # Asynchronous Example
 import asyncio
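
The README's asynchronous example is truncated in this hunk. For orientation, a minimal sketch of the pattern the changed sentence describes might look like the following; the `general.partition_async` method, the `operations.PartitionRequest`/`shared.PartitionParameters` request layout, and the file name are assumptions for illustration, not content taken from this diff.

```python
# Hedged sketch of an asynchronous request with the SDK client.
# NOTE: partition_async and the request layout are assumptions for
# illustration; consult the SDK documentation for the exact API.
import asyncio

from unstructured_client import UnstructuredClient
from unstructured_client.models import operations, shared


async def partition_file(filename: str):
    uc_client = UnstructuredClient()
    with open(filename, "rb") as f:
        req = operations.PartitionRequest(
            partition_parameters=shared.PartitionParameters(
                files=shared.Files(content=f.read(), file_name=filename),
            ),
        )
    # Await the asynchronous variant of the partition call.
    res = await uc_client.general.partition_async(request=req)
    return res.elements


elements = asyncio.run(partition_file("sample.pdf"))
```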
@@ -351,7 +351,7 @@ See [page splitting](https://docs.unstructured.io/api-reference/api-services/sdk
 In order to speed up processing of large PDF files, the client splits up PDFs into smaller files, sends these to the API concurrently, and recombines the results. `split_pdf_page` can be set to `False` to disable this.

 The amount of workers utilized for splitting PDFs is dictated by the `split_pdf_concurrency_level` parameter, with a default of 5 and a maximum of 15 to keep resource usage and costs in check. The splitting process leverages `asyncio` to manage concurrency effectively.
-The size of each batch of pages (ranging from 2 to 20) is internally determined based on the concurrency level and the total number of pages in the document. Because the splitting process uses `asyncio` the client can encouter event loop issues if it is nested in another async runner, like running in a `gevent` spawned task. Instead, this is safe to run in multiprocessing workers (e.g., using `multiprocessing.Pool` with `fork` context).
+The size of each batch of pages (ranging from 2 to 20) is internally determined based on the concurrency level and the total number of pages in the document. Because the splitting process uses `asyncio` the client can encounter event loop issues if it is nested in another async runner, like running in a `gevent` spawned task. Instead, this is safe to run in multiprocessing workers (e.g., using `multiprocessing.Pool` with `fork` context).

 Example:
 ```python
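
The README's own example following this hunk is not shown in the diff. As context for the parameters the paragraph documents, a hedged sketch of setting `split_pdf_page` and `split_pdf_concurrency_level` might look like the following; only those two parameters come from the paragraph itself, while the surrounding request layout mirrors the sketch above and is an assumption.

```python
# Hedged sketch: tuning client-side PDF splitting on a partition request.
# Only split_pdf_page / split_pdf_concurrency_level come from the paragraph
# above; the request layout is assumed for illustration.
from unstructured_client import UnstructuredClient
from unstructured_client.models import operations, shared

uc_client = UnstructuredClient()

with open("large.pdf", "rb") as f:
    req = operations.PartitionRequest(
        partition_parameters=shared.PartitionParameters(
            files=shared.Files(content=f.read(), file_name="large.pdf"),
            split_pdf_page=True,             # set to False to disable splitting
            split_pdf_concurrency_level=10,  # default 5, capped at 15
        ),
    )

res = uc_client.general.partition(request=req)
print(len(res.elements or []))
```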
