What I Learnt This Week part 5
Week 5
This is my brain dump for this week on things I learned, tech-related or otherwise.
Tech
- CSS sprites combine multiple images into a single file; each individual image is then displayed by offsetting it with CSS styles (e.g. background-position). This makes image caching more effective and saves server round trips, since one combined sprite is fetched instead of the server being queried for multiple separate images.
- Meta is in the process of ending third-party fact-checking and moving to a community-based model. I can't lie, I haven't used Facebook in ages, but this impacts all Meta products, although I don't see as much false information on Instagram as compared to something like Twitter (we won't be calling it X). Community fact-checking has worked out pretty well for Twitter, considering Elon is the most fact-checked individual on that app.
- Great post (https://www.seangoedecke.com/large-established-codebases/) highlighting the importance of consistency when writing features for large codebases. Always look for similar patterns in the codebase before implementing your own, because following them eliminates many edge cases that really large repos already handle, and it avoids a lot of headaches and bug fixing after the feature is integrated. This applies to even the most naive and simple feature additions: always look at a similar pattern for the feature you are implementing.
- C adds padding to structs with differently sized members, so the size of a struct is not always equal to the sum of the sizes of its members. This is done to satisfy CPU alignment constraints, which affect performance and, on some architectures, the correctness of the code. If you're looking to minimize memory usage, grouping smaller members together and sorting members by alignment helps (see the sketch after this list). Compilers can also pack structures (e.g. via a packed attribute/directive), but packed structs can cause issues when the code is ported to different architectures.
- The open-source aspect of DeepSeek is very fascinating. They were in a unique position, having achieved their results on a fraction of the hardware of their competitors, but they probably had doubts about adoption of their models because of skepticism in the West when it comes to Chinese tech. Open-sourcing the model means people aren't afraid to try it out and possibly extend it and integrate it into their own use cases. I think it's a brilliant move, but the reaction in Nvidia's stock, or thinking this is the end of OpenAI, is an exaggeration. Maybe it will force "open" AI to finally open-source GPT-4 and live up to their name, or maybe they have newer developments; only time will tell!
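To make the struct padding point concrete, here is a minimal C sketch (the struct names are made up for illustration) that prints the size of a poorly ordered struct versus the same members sorted by alignment:

```c
#include <stdio.h>
#include <stdint.h>

/* Members ordered char, uint64_t, char: the compiler inserts padding
   around the chars so the 8-byte member stays 8-byte aligned. */
struct badly_ordered {
    char     a;   /* 1 byte + 7 bytes padding */
    uint64_t b;   /* 8 bytes */
    char     c;   /* 1 byte + 7 bytes tail padding */
};

/* Same members, largest alignment first: only tail padding remains. */
struct well_ordered {
    uint64_t b;   /* 8 bytes */
    char     a;   /* 1 byte */
    char     c;   /* 1 byte + 6 bytes tail padding */
};

int main(void) {
    /* On a typical 64-bit system this prints 24 and 16, not 10. */
    printf("badly_ordered: %zu bytes\n", sizeof(struct badly_ordered));
    printf("well_ordered:  %zu bytes\n", sizeof(struct well_ordered));
    return 0;
}
```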
Otherwise
- New York: they weren't lying when they said New York has a certain magic that makes you feel like you can accomplish anything. Maybe this is me looking at it through rose-colored glasses, but there was something in the air that made me feel like I belonged there, especially at this time in my life, my early 20s. The fast-paced life is such a contrast to LA, which is super laid back day to day. Definitely a must visit in my books. Food was GREAT!