Reduced caching #64
Conversation
Summary of Changes (Gemini Code Assist): This pull request focuses on enhancing data freshness by adjusting the caching strategy for market statistics. It removes a Next.js
Code Review
This pull request aims to reduce caching by refactoring getIqStats and decreasing the cacheDuration for fetchMarketCapData. However, a critical security vulnerability was identified: the fetchMarketCapData function exposes an API key to the client-side because it's loaded from a NEXT_PUBLIC_ environment variable. This allows any user to steal and abuse the key. It is crucial to move this key to a private, server-side only environment variable.
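The fix the review calls for can be sketched as follows. This is a hypothetical illustration, not the PR's actual code: `fetchMarketCapData` comes from the review comment, while `COINGECKO_API_KEY` and `buildCoinGeckoHeaders` are assumed names; `x-cg-pro-api-key` is CoinGecko Pro's documented auth header.

```typescript
// Hypothetical sketch of the suggested fix: read the key from a
// server-only env var (no NEXT_PUBLIC_ prefix), so Next.js never
// inlines it into the client bundle.

// Assumed helper name; builds the CoinGecko Pro auth header.
function buildCoinGeckoHeaders(apiKey: string): Record<string, string> {
  return { "x-cg-pro-api-key": apiKey };
}

// Server-side only: a NEXT_PUBLIC_-prefixed variable is shipped to every
// visitor's browser; COINGECKO_API_KEY (assumed name) stays on the server.
const apiKey = process.env.COINGECKO_API_KEY ?? "";

async function fetchMarketCapData(): Promise<unknown> {
  const res = await fetch(
    "https://pro-api.coingecko.com/api/v3/coins/markets?vs_currency=usd&ids=everipedia",
    { headers: buildCoinGeckoHeaders(apiKey) },
  );
  if (!res.ok) throw new Error(`CoinGecko request failed: ${res.status}`);
  return res.json();
}
```

With this change, the key never appears in the client bundle, so it cannot be scraped from the page source or devtools.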
@@ -1,5 +1,5 @@
 #!/bin/sh
-. "$(dirname "$0")/_/husky.sh"
+# . "$(dirname "$0")/_/husky.sh" # Deprecated in husky v10, TODO: remove this permanently
 params: {
   url: "https://pro-api.coingecko.com/api/v3/coins/markets?vs_currency=usd&ids=everipedia",
-  cacheDuration: CACHE_DURATION_SECONDS_24_HR_IN_SECONDS,
+  cacheDuration: CACHE_DURATION_SECONDS_5_MIN_IN_SECONDS,
Is this in sync with: https://github.com/IQAIcom/iq-ai-landing/pull/51/changes
Yessir, they're both 5 minutes cache duration now
Just to clarify, is 5 minutes the agreed cache duration?
 } from "./_schema";

-const CACHE_DURATION_SECONDS_24_HR_IN_SECONDS = 24 * 60 * 60;
+const CACHE_DURATION_SECONDS_5_MIN_IN_SECONDS = 5 * 60;
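For reference, the durations debated in this thread work out as below. All names here are illustrative (the PR's own constants repeat "SECONDS" twice, e.g. `CACHE_DURATION_SECONDS_5_MIN_IN_SECONDS`); the 12-hour value is the IQWiki duration mentioned in the discussion.

```typescript
// Illustrative duration constants (hypothetical names, not the PR's).
const CACHE_DURATION_24_HR_IN_SECONDS = 24 * 60 * 60; // 86400 (current landing page)
const CACHE_DURATION_12_HR_IN_SECONDS = 12 * 60 * 60; // 43200 (IQWiki)
const CACHE_DURATION_5_MIN_IN_SECONDS = 5 * 60;       // 300 (proposed here)
```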
5 minutes cache duration? Are you sure about that?
We run the risk of maxing out our credits faster with this change, given that we're using the API across several projects.
It's important you confirm before going ahead with this.
Yep, that's why I asked for both your reviews before going ahead with this. The agent trading platform requests the IQ price, plus ETH and FRAX I believe, with a cache duration of 1 minute 😅
In theory, those would always hit cached prices for IQ, no? @Adebesin-Cell
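A rough way to frame the credit concern: with a cache in front of the API, each site makes at most one upstream call per cache window. Back-of-envelope numbers (illustrative, assuming a single cached endpoint per site and no cache stampedes):

```typescript
// Upstream calls per day for one cached endpoint, assuming the cache
// is refreshed exactly once per window (best case).
const SECONDS_PER_DAY = 24 * 60 * 60;
const upstreamCallsPerDay = (cacheSeconds: number): number =>
  SECONDS_PER_DAY / cacheSeconds;

// 24 h cache  -> 1 call/day
// 12 h cache  -> 2 calls/day
// 5 min cache -> 288 calls/day
// 1 min cache -> 1440 calls/day
```

So moving this page from 24 hours to 5 minutes multiplies its upstream call volume by 288; multiplied across several projects, that is where the credit risk comes from.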
We also use 5 minutes for the IQAI landing
I see IQWiki uses 12 hrs
That's true. I think we should take a moment before merging. Let's test and observe the outcomes first: we can set up a dummy API that sends some data and monitor the cache. This way we can replicate our situation and evaluate the trade-offs.
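The "dummy API" idea above could be as simple as an endpoint that counts how often it is actually invoked; comparing that counter against the number of client requests reveals the cache hit rate. A minimal sketch, where the handler name and response shape are assumptions rather than anything from the PR:

```typescript
// Hypothetical dummy upstream: the counter increments only on a real
// (uncached) invocation, so when this sits behind the cache layer,
// upstreamHits counts cache misses.
let upstreamHits = 0;

function handleDummyQuote(): { price: number; upstreamHits: number } {
  upstreamHits += 1;
  // Fixed fake price payload, mimicking a market-data response.
  return { price: 0.005, upstreamHits };
}
```

If clients make N requests over the test window and `upstreamHits` only reaches N / 288, the 5-minute cache is doing its job.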
If it takes too much time, like Kesar said:
"if it takes too long, we don't need to overcomplicate stuff. Figure out the one that has the longest cache and try to set a similar one. It's fine if two sites in our list of websites have a slightly different number; it probably bothers nobody but us."
The longest cache is this landing page at 24 hrs... We could set all landing pages to 12 hrs like the wiki. 24 is a lot with how the price moves these days.
Also, check the ext-api and see if there's a key with caching.
Sounds good 👍. @Damola18 what do you think?
Yeah, we can do that. We can change to 6 hours after monitoring over the next few days.
Changes made: