elvith

joined 1 year ago
[–] elvith@feddit.de 4 points 5 months ago

It's referring to both. The recompiler links to the Zelda project and basically tells you "if you want to have an example of how to proceed / what to implement yourself after the recompilation has finished, you can use the Zelda project as an example".

[–] elvith@feddit.de 15 points 5 months ago (2 children)

Well, usually those recompilers or transpilers just translate the binary into some sort of intermediate language, and then any backend should be able to compile it for your target system. So, in theory, those handhelds could be targeted. The problem with this project is that it's not just "start the transpiler, load the ROM, click go, and your port is ready". It's more like "OK, here's your game logic. Now implement the rest (or use several other projects and duct-tape their libraries together to get what you want)."
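To make that concrete, here's a minimal sketch of what statically recompiled output tends to look like. The context struct, register indices, and function/platform names are assumptions made up for illustration, not the recompiler's actual output:

```c
/* Illustrative sketch only - an assumed shape for statically recompiled
 * output, not what the actual tool emits. */
#include <stdint.h>

typedef struct {
    uint64_t gpr[32];  /* emulated general-purpose registers */
    uint8_t *rdram;    /* emulated console RAM */
} recomp_context;

/* One original game function, mechanically translated instruction by
 * instruction into portable C that any host compiler can build. */
void game_func_80023AB0(recomp_context *ctx) {
    /* addiu $t0, $zero, 30   -> load an immediate into emulated $t0 */
    ctx->gpr[8] = 30;
    /* sw    $t0, 0x1234($a0) -> store it into emulated RAM (simplified) */
    uint32_t addr = (uint32_t)ctx->gpr[4] + 0x1234u;
    *(uint32_t *)(ctx->rdram + addr) = (uint32_t)ctx->gpr[8];
}

/* What the recompiler does NOT hand you: the platform layer those
 * functions expect. These still have to be implemented (or borrowed from
 * other projects) for each target, e.g. a handheld. */
void platform_present_frame(void);
void platform_poll_input(void);
void platform_submit_audio(void);
```

The translated game logic compiles anywhere, but the platform stubs at the bottom are exactly the "rest" that has to be implemented or duct-taped together per target.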

[–] elvith@feddit.de 9 points 6 months ago

Fifteen Million Merits, IIRC?

[–] elvith@feddit.de 21 points 6 months ago

Nah, it just marks your question as duplicate.

[–] elvith@feddit.de 10 points 6 months ago (2 children)

11 in binary is 3, so....

[–] elvith@feddit.de 7 points 6 months ago

Let me guess - long distance is if it's outside the prison? /s

[–] elvith@feddit.de 1 points 6 months ago (1 children)

'Tis the season to be jolly
Fa la la la la fa la la la
Don't they ~~know~~ bear the gay ~~apparel~~ appearance
Faaa Fa la la la la la
Tilt the all time right wing audience
Fa la la fa la la la laaaaaaa!

[–] elvith@feddit.de 6 points 6 months ago

If you edit the ini file, you can unlock (or increase?) the frame rate limit so that it isn't stuck at 30 fps

 

Abstract

Consent plays a profound role in nearly all privacy laws. As Professor Heidi Hurd aptly said, consent works “moral magic” – it transforms things that would be illegal and immoral into lawful and legitimate activities. As to privacy, consent authorizes and legitimizes a wide range of data collection and processing.

There are generally two approaches to consent in privacy law. In the United States, the notice-and-choice approach predominates; organizations post a notice of their privacy practices and people are deemed to consent if they continue to do business with the organization or fail to opt out. In the European Union, the General Data Protection Regulation (GDPR) uses the express consent approach, where people must voluntarily and affirmatively consent.

Both approaches fail. The evidence of actual consent is non-existent under the notice-and-choice approach. Individuals are often pressured or manipulated, undermining the validity of their consent. The express consent approach also suffers from these problems – people are ill-equipped to decide about their privacy, and even experts cannot fully understand what algorithms will do with personal data. Express consent also is highly impractical; it inundates individuals with consent requests from thousands of organizations. Express consent cannot scale.

In this Article, I contend that most of the time, privacy consent is fictitious. Privacy law should take a new approach to consent that I call “murky consent.” Traditionally, consent has been binary – an on/off switch – but murky consent exists in the shadowy middle ground between full consent and no consent. Murky consent embraces the fact that consent in privacy is largely a set of fictions and is at best highly dubious.

Because it conceptualizes consent as mostly fictional, murky consent recognizes its lack of legitimacy. To return to Hurd’s analogy, murky consent is consent without magic. Rather than provide extensive legitimacy and power, murky consent should authorize only a very restricted and weak license to use data. Murky consent should be subject to extensive regulatory oversight with an ever-present risk that it could be deemed invalid. Murky consent should rest on shaky ground. Because the law pretends people are consenting, the law’s goal should be to ensure that what people are consenting to is good. Doing so promotes the integrity of the fictions of consent. I propose four duties to achieve this end: (1) duty to obtain consent appropriately; (2) duty to avoid thwarting reasonable expectations; (3) duty of loyalty; and (4) duty to avoid unreasonable risk. The law can’t make the tale of privacy consent less fictional, but with these duties, the law can ensure the story ends well.

[–] elvith@feddit.de 11 points 6 months ago (1 children)

I mean, the hosting company would be the likely target then, and they'd probably lock your account and switch off the server. Depending on your nationality and that of the hosting provider, at least.

[–] elvith@feddit.de 3 points 6 months ago* (last edited 6 months ago) (1 children)

I found a blog post outlining exactly that. If you use it locally, it will install and start a service temporarily. That service runs as SYSTEM and invokes your command. To succeed, you need to be a local administrator.

If you try the same remotely, it tries to access \\remote-server-ip\ADMIN$ and installs the service via that share. To succeed, your current account on your local machine must exist on the remote machine and must be an administrator there.

So, in short: it only works if you already have the privileges to do so, and the tool itself is not (ab)using a privilege escalation or anything like that. Any hacker or virus could do the very same thing without psexec - it's just easier for them to use that tool.
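For illustration, this is roughly the Windows API flow such a tool has to go through; a sketch of the general mechanism only, not PsExec's actual source, and the service name, binary path, and remote-server-ip placeholder are hypothetical. The key point is that connecting to the Service Control Manager with create rights already fails unless you're an administrator on the target:

```c
/* Sketch of a PsExec-style flow (hypothetical names/paths; link with Advapi32.lib). */
#include <windows.h>
#include <stdio.h>

int main(void) {
    /* Connecting to the remote Service Control Manager with create rights
       only succeeds if the calling account is an administrator there -
       this is where "you must already have the privilege" comes from. */
    SC_HANDLE scm = OpenSCManagerW(L"\\\\remote-server-ip", NULL, SC_MANAGER_CREATE_SERVICE);
    if (scm == NULL) {
        printf("No admin rights on the target, error %lu\n", GetLastError());
        return 1;
    }

    /* The service binary would already have been copied to the ADMIN$ share,
       i.e. \\remote-server-ip\ADMIN$ (= C:\Windows) - which also needs admin. */
    SC_HANDLE svc = CreateServiceW(
        scm, L"DemoExecSvc", L"DemoExecSvc",
        SERVICE_ALL_ACCESS, SERVICE_WIN32_OWN_PROCESS,
        SERVICE_DEMAND_START, SERVICE_ERROR_NORMAL,
        L"C:\\Windows\\demo_exec_svc.exe",
        NULL, NULL, NULL,
        NULL,   /* account NULL => the service runs as LocalSystem (SYSTEM) */
        NULL);
    if (svc != NULL) {
        StartServiceW(svc, 0, NULL);  /* the service now runs the command as SYSTEM */
        DeleteService(svc);           /* "temporarily": mark it for removal afterwards */
        CloseServiceHandle(svc);
    }
    CloseServiceHandle(scm);
    return 0;
}
```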

[–] elvith@feddit.de 5 points 6 months ago (3 children)

Never thought about that, but since these tools just work when you copy them to your PC... how does psexec do that? It'd either need you to be an administrator (and then it's not really a privilege escalation, as you could have registered any program in the task scheduler or as a service to run as SYSTEM), or it'd need a delegate service that would only be available if you used an installer - which, again, isn't what happens when you just copy the tool.

[–] elvith@feddit.de 11 points 6 months ago (6 children)

Also please pre-install the Sysinternals Suite, thanks

 

The Wall Street Journal reported that Meta plans to move to a "Pay for your Rights" model, where EU users will have to pay $168 a year (€160 a year) if they don't agree to give up their fundamental right to privacy on platforms such as Instagram and Facebook. History has shown that Meta's regulator, the Irish DPC, is likely to agree to any way that Meta can bypass the GDPR. However, the company may also be able to use six words from a recent Court of Justice (CJEU) ruling to support its approach.
