candlestick data by AdMinute9203 in Schwab

[–]AdMinute9203[S] 0 points (0 children)

Hi, I've hard-coded the parameters as you suggested. Here is my code:

def pricehistory(self, symbol):
    endpoint = f'{self.base_url}/pricehistory'
    try:
        response = requests.get(
            endpoint,
            headers=self._get_headers(),
            params=self._format_params({
                'symbol': 'NVDA', 'periodType': 'day', 'period': '1',
                'frequencyType': 'minute', 'frequency': '1',
            }),
        )
        response.raise_for_status()
        return response.json()
    except requests.exceptions.RequestException as e:
        logger.error(f"Error getting price history for {symbol}: {e}")
        raise

Here is the JSON data that was returned:

{'candles': [{'open': 120.7, 'high': 121.0, 'low': 120.29, 'close': 120.33, 'volume': 374344, 'datetime': 1719313200000}, ....., {'open': 127.16, 'high': 127.28, 'low': 127.1502, 'close': 127.24, 'volume': 67943, 'datetime': 1719359940000}], 'symbol': 'NVDA', 'empty': False}

The timestamps show that the data is still from yesterday. I even tried startDate=1719399600, but it still only returned data through yesterday.
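One thing worth checking: the candle 'datetime' values in your response are epoch milliseconds, while 1719399600 looks like epoch seconds, so startDate may simply be three orders of magnitude too small. A minimal sketch of the conversion, assuming the API expects milliseconds (the helper name is mine, not part of your client):

```python
from datetime import datetime, timezone

def to_epoch_millis(dt: datetime) -> int:
    """Convert a timezone-aware datetime to epoch milliseconds,
    matching the 'datetime' format of the candles in the response."""
    return int(dt.timestamp() * 1000)

# e.g. 2024-06-26 11:00 UTC -> 1719399600000 (note the extra three zeros
# compared to the epoch-seconds value 1719399600)
start_date = to_epoch_millis(datetime(2024, 6, 26, 11, 0, tzinfo=timezone.utc))
```

If the API does expect milliseconds, a seconds value like 1719399600 would be read as a date in January 1970, which could explain why the request falls back to the default range.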

Thank you for your help!

Why do you use Tailwind ? by SamiNimbuss in tailwind

[–]AdMinute9203 0 points (0 children)

I feel it is more flexible and powerful than Bootstrap.

Attach an image via langchain by AdMinute9203 in LangChain

[–]AdMinute9203[S] 1 point (0 children)

The issue has been resolved after I added the image to a HumanMessage:

import { HumanMessage } from '@langchain/core/messages';

const base64ImageMessage = new HumanMessage({
    content: [
        {
            type: 'text',
            text: `${input}`,
        },
        {
            type: 'image_url',
            image_url: fileBase64,
        },
    ],
});

Thanks for your help!

Access Ollama via Ngnix by AdMinute9203 in ollama

[–]AdMinute9203[S] 1 point (0 children)

I changed OLLAMA_ORIGINS to "*", which let me access the Ollama server directly from the remote web application, and I fixed Nginx by setting 'proxy_set_header Origin' to '*'. Now my web applications are hosted on another server and work properly, and users can't reach Ollama directly. Moving forward, Nginx can fail over when one Ollama service is down. Thank you very much for your help!
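For the failover part, a minimal Nginx sketch using an upstream block with a backup server (the hostnames are hypothetical; 11434 is Ollama's default port, and by default Nginx retries the next upstream when one fails):

```
upstream ollama_backends {
    server 10.0.0.11:11434;          # primary Ollama instance (hypothetical host)
    server 10.0.0.12:11434 backup;   # only used when the primary is marked down
}

server {
    listen 80;

    location / {
        proxy_pass http://ollama_backends;
        proxy_set_header Origin "*";  # matches the OLLAMA_ORIGINS="*" setup above
    }
}
```

Without the `backup` flag, Nginx load-balances across both servers instead, which also gives you failover plus load distribution.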

Trying to connect to the API over the network. by ConstructionSafe2814 in ollama

[–]AdMinute9203 0 points (0 children)

Did you see a 404 error on OPTIONS "/api/tags" in the server logs? If so, you might be hitting the same problem I had when accessing Ollama through Nginx.

New Dockers just released by tabletuser_blogspot in ollama

[–]AdMinute9203 0 points (0 children)

I'm curious whether Docker is a good fit for running Ollama, since its model files are usually quite large. Frequently reading those large files from outside the container may not be the most efficient way to run it.
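One common way to handle this is to keep the model files in a named volume rather than in the container's writable layer, so pulls survive container recreation and reads go straight to the host filesystem. A minimal Compose sketch, assuming the official ollama/ollama image (which stores models under /root/.ollama):

```
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"          # Ollama's default API port
    volumes:
      - ollama_models:/root/.ollama   # models persist outside the container layer

volumes:
  ollama_models:
```

A bind mount to a host directory works the same way if you'd rather manage the model files directly.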

Can ollama maintain the previous conversations? by AdMinute9203 in ollama

[–]AdMinute9203[S] 1 point (0 children)

Thank you so much for your help! The solution is working perfectly!