Hacker News | Delitio's comments

Do you have any sources for this?


Fidel Castro, Kim Jong Il, and Vladimir Putin.


Those are the only three instances where sanctions were used "ever in human history"?


Those are the most recent attempts, and thus the most applicable to today's world. Yes.


The sanctions of the last 8 years have probably played a significant role in diminishing the RF's capacity to effectively occupy Ukraine.


OTOH, Russia successfully elected a POTUS that helped cripple Ukraine’s ability to defend itself.


Nope, it's not ridiculous. If you are only allowed to store data for X months, that's it.

It's your job to use techniques which let you comply, such as encrypting your backups and deleting the keys, for example.
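The key-deletion approach ("crypto-shredding") can be sketched like this. This is a toy illustration in Python: the hash-based XOR keystream stands in for a real cipher such as AES-GCM or Fernet, and all the names (`write_backup`, `expire_backup`, etc.) are made up for the example.

```python
import hashlib
import secrets

def keystream(key: bytes, length: int) -> bytes:
    # Toy keystream derived from the key -- NOT real crypto,
    # use AES-GCM / Fernet in practice.
    out, counter = b"", 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, data: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

decrypt = encrypt  # XOR stream cipher: encryption and decryption are identical

# Each backup gets its own key, stored separately from the backup itself.
backup_keys: dict[str, bytes] = {}

def write_backup(name: str, data: bytes) -> bytes:
    key = secrets.token_bytes(32)
    backup_keys[name] = key
    return encrypt(key, data)

def expire_backup(name: str) -> None:
    # Crypto-shredding: deleting the key renders the ciphertext useless,
    # even if copies of the encrypted backup linger on tape or object storage.
    del backup_keys[name]

blob = write_backup("2024-01", b"customer records")
assert decrypt(backup_keys["2024-01"], blob) == b"customer records"
expire_backup("2024-01")  # after this, blob can never be read again
```

The point is that you never have to chase down every physical copy of a backup to "delete" it; destroying the one small key is enough.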


Delitio> If you are only allowed to store data for X months, that's it.

Exactly. I'm not aware of any laws saying "you must delete this data immediately"; more like "within X days or months". The "permanently delete" action presumably skips some cooling-off period in the online database but not the backup, which seems perfectly appropriate, provided your backup retention is compliant.

Google has a nice page describing their deletion process. [1] It doesn't go into product-specific technical details/steps (like being marked as deleted within the product, the row being deleted from Bigtable/Spanner, major compaction guaranteed to happen, backups guaranteed to be deleted or unusable), but it says this:

Google> We then begin a process designed to safely and completely delete the data from our storage systems. Safe deletion is important to protect our users and customers from accidental data loss. Complete deletion of data from our servers is equally important for users’ peace of mind. This process generally takes around 2 months from the time of deletion. This often includes up to a month-long recovery period in case the data was removed unintentionally.

This is a best practice.

Delitio> It's your job to use techniques which let you comply, such as encrypting your backups and deleting the keys, for example.

If they'd thrown away the encryption key immediately, this would have been much worse. Instead of "we're down for 2 weeks?!?" (already quite bad) it'd be "our data is gone forever?!?". You never want to delete anything too quickly for exactly this reason.

[1] https://policies.google.com/technologies/retention?hl=en-US


The calculation is very thin in my opinion.

$10k of infra costs is not a lot of money in a business context.

A person to operate a private cloud, with on-call, backup hardware, capacity planning, etc., costs what?

I would always try to have GPUs on-prem, as those prices are quite high, but for everything else I would use managed services.

Cloud providers are just much better at operating infrastructure and normal ops.


How did you determine the pictures he took have a high dynamic range?

I looked at those pictures and had the opposite feeling.

I was surprised how bad they are from a pure quality aspect.

I assume the JPEG compression and the scanning are very hard limiting factors.

I can get a lot of range when shooting raw with my Canon 80D.


The analog pictures are reasonably contrasty, but they do preserve highlights in a very nice way.

If you look at the top picture in the post then I'd have more dynamic range with my DSLR for most of that picture, but the parts in direct sunlight would be completely clipped instead.


Is there any source which explains what billions of parameters actually are?

In my mind a parameter is something like: language, dialect, perhaps context parameters (food, dinner, lunch, travel), and if we then talk about language and audio, perhaps sound waves or gender.

Or are they context parameters which give you insight? Like, are a billion parameters literally something like travel=false, travel-europe=true, people speaking=e, age, height,


Parameters are just floating-point numbers; at most they can be seen as degrees of freedom, or kind of like the order of a polynomial used in curve fitting.

They're too abstract to assign much meaning to individual parameters, as our understanding of why their values are exactly the way they are is extremely limited.
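The curve-fitting analogy can be made concrete. A minimal NumPy sketch (the data and the polynomial orders are arbitrary choices for illustration):

```python
import numpy as np

# Fit polynomials of increasing order to noisy data: each coefficient is
# one "parameter", directly analogous to a weight in a neural network.
rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 50)
y = np.sin(3 * x) + 0.1 * rng.standard_normal(50)

for order in (1, 3, 9):
    coeffs = np.polyfit(x, y, order)  # order + 1 parameters
    mse = np.mean((np.polyval(coeffs, x) - y) ** 2)
    print(f"order {order}: {len(coeffs)} parameters, MSE {mse:.4f}")
```

More parameters means a more flexible curve and a better fit to the training data; no individual coefficient "means" anything by itself, which is exactly the situation with network weights.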


A good visual introduction to neural networks can be found here: https://playground.tensorflow.org

A parameter is a "weight" in this case (the lines drawn from neuron to neuron). The neurons are effectively runtime values or "activations." Parameters (weights) are updated during training and then set as constant during "inference" (also called "prediction").

There's unfortunately a ton of jargon and different groups use different words almost exclusively.
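A minimal NumPy sketch of that weight/activation distinction (the layer sizes are arbitrary, roughly matching a small playground network):

```python
import numpy as np

rng = np.random.default_rng(0)

# A tiny 2-layer network: 2 inputs -> 4 hidden neurons -> 1 output.
# The weight matrices and biases are the parameters; they are fixed
# after training and constant during inference.
W1, b1 = rng.standard_normal((4, 2)), np.zeros(4)  # 8 + 4 parameters
W2, b2 = rng.standard_normal((1, 4)), np.zeros(1)  # 4 + 1 parameters

def forward(x):
    h = np.tanh(W1 @ x + b1)  # h holds activations: runtime values, not parameters
    return W2 @ h + b2

n_params = sum(p.size for p in (W1, b1, W2, b2))
print(n_params)  # 17
```

So "a billion parameters" just means a billion entries spread across matrices like `W1` and `W2`, at vastly larger sizes.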


A parameter is a scalar value, and most of them are in the attention matrices and feedforward matrices; you'll also hear these called "weights". Any intro to DL course will cover these in detail. I recommend starting with Andrew Ng's Coursera class on Intro to Machine Learning, although there may be better ones out there now.
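To get a feel for where those weights live, here's a back-of-the-envelope count for a GPT-2-small-sized transformer. The d_model=768, 12-layer, 50257-token-vocabulary figures are public; the per-layer breakdown below is the standard approximation, ignoring biases, layer norms, and position embeddings:

```python
# Rough parameter count for a GPT-2-small-sized transformer.
d, layers, vocab = 768, 12, 50257

attention = 4 * d * d          # Q, K, V, and output projection matrices
feedforward = 2 * d * (4 * d)  # up-projection to 4d, then back down to d
per_layer = attention + feedforward
embeddings = vocab * d         # token embedding table

total = layers * per_layer + embeddings
print(f"{total:,}")  # close to the ~124M usually reported for GPT-2 small
```

Almost all of the count comes from a handful of large matrices repeated per layer, plus the embedding table.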


Input parameters vs. weights, then?

I see tx


These networks (text models) usually have around a few thousand inputs.


The parameters are the weights of the neural network, in this case.


It's rare that a single parameter maps to a human-understandable concept. Occasionally someone finds one that does map fairly well; for example, this case back in 2017: https://openai.com/blog/unsupervised-sentiment-neuron/#senti...

