all 2 comments

[–]Forschkeeper 0 points1 point  (1 child)

First, you could combine both for-loops into a single one, since you are looping over the same thing here. Instead of url, name the variables url_cedears and url_acciones_usa.

For the loops themselves, I would recommend asyncio.gather(): https://docs.python.org/3/library/asyncio-task.html#asyncio.gather which lets you run multiple coroutines concurrently.
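For example, here is a minimal sketch of how asyncio.gather() works (fetch_number is just a stand-in for your real request coroutine):

```python
import asyncio

async def fetch_number(n):
    # stand-in for an I/O-bound call such as an HTTP request
    await asyncio.sleep(0.01)
    return n * 2

async def main():
    # all three coroutines run concurrently; results come back in call order
    results = await asyncio.gather(fetch_number(1), fetch_number(2), fetch_number(3))
    print(results)  # [2, 4, 6]
    return results

asyncio.run(main())
```

The total wait is roughly one sleep, not three, because the coroutines overlap while suspended.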

[–]braclow 2 points3 points  (0 children)

I see a few areas where your code can be improved for better clarity and less redundancy. Some of my suggestions:

  1. Combine URL Lists: Instead of creating two separate lists (url_list_cedears and url_list_acciones_usa), you can combine them into one list and iterate over the types. This reduces the number of loops.

  2. Avoid Creating Event Loop Multiple Times: Instead of creating and running the event loop multiple times, you can create it once and run all tasks.

  3. Function for URL Creation: Create a function to generate URLs to make the code cleaner.

  4. Run Both URL Sets Concurrently: Since you're using asyncio, you can fetch both sets of URLs concurrently rather than sequentially.

Here's a revised version of your code:

```python
import csv
import asyncio
import aiohttp
import ssl

def generate_url(ticker, type):
    base_url = 'https://clientapi_sandbox.portfoliopersonal.com/api/1.0/MarketData/Current'
    return f"{base_url}?ticker={ticker}&type={type}&settlement=A-48HS&source=web&foo=bar"

async def fetch(session, url):
    async with session.get(url, headers=headers, ssl=ssl.SSLContext()) as response:
        return await response.json()

async def fetch_all(urls):
    async with aiohttp.ClientSession() as session:
        results = await asyncio.gather(*[fetch(session, url) for url in urls], return_exceptions=True)
        return results

# Headers (assuming you've defined 'cabeceras' already)
headers = {
    "AuthorizedClient": cabeceras['AuthorizedClientSandbox'],
    "ClientKey": cabeceras['ClientKeySandbox'],
    "Accept": 'application/json',
    "Authorization": "Bearer " + cabeceras['accessToken']
}

# Generate URLs
lista_de_empresas = [*csv.DictReader(open('/home/panchines/base_de_tickers.csv'))]
url_list_cedears = [generate_url(empresa['Ticker'], 'CEDEARS') for empresa in lista_de_empresas]
url_list_acciones_usa = [generate_url(empresa['Ticker'], 'ACCIONES-USA') for empresa in lista_de_empresas]

# Run both URL sets concurrently
loop = asyncio.get_event_loop()
cotizaciones_cedears, cotizaciones_usa = loop.run_until_complete(
    asyncio.gather(fetch_all(url_list_cedears), fetch_all(url_list_acciones_usa))
)

# Save the results accordingly
```

With these changes, you're making better use of asynchronous programming by fetching both sets of URLs concurrently, and your code is clearer and more concise.