• 0 Posts
  • 49 Comments
Joined 1 year ago
Cake day: June 21st, 2023

  • With this technology lens, would Dune still be considered sci-fi? Its technology is different, sure, but in many ways worse than what we have now (except for space travel): they don’t have computers, they rely on hand-to-hand combat, and their spies can’t hide mics, so they hide in walls for days!

    It’s another hyper-militarized universe like the one the Cold War brought us, but with religion and drugs :^)


  • I wanted to say that it’s hard to define exactly what is or isn’t sci-fi. Really, I’m just a sci-fi enjoyer and not qualified to say what counts :D

    Kryptonite for me is clearly a magic rock, but in the movies it sits within the realm of their science. There was also a movie where Superman’s existence led to a lot of questioning about its implications for defense politics, so it could fit some part of your definition, I guess?

    So Superman is science-based, and you’re right that X-Men is too: it clearly asks what it means to be human now that augmented humans exist. That makes it more sci-fi than Superman.

    But films can be both sci-fi and fantasy. It feels like a sliding scale depending on how much the universe is grounded in hard science. On the DNA subject, Gattaca is not fantasy, but X-Men is.

    To me it feels similar to the debate about “hard magic” universes like Eragon (where every spell has a physical toll on the user, or other book series where the magic is really detailed in-universe and only mastered by experts who study their whole lives for even a basic spell) and “soft magic” like Harry Potter, where everyone can cast Cruciatus curses at the speed of an automatic rifle (I’m slightly exaggerating).






  • I explored the source of file(1), and the part that determines the type of a text file seems to be in text.c: https://cvsweb.openbsd.org/cgi-bin/cvsweb/~checkout~/src/usr.bin/file/text.c?rev=1.3&content-type=text/plain

    And especially this part:

    static int
    text_try_test(const void *base, size_t size, int (*f)(u_char))
    {
    	const u_char	*data = base;
    	size_t		 offset;
    
    	for (offset = 0; offset < size; offset++) {
    		if (!f(data[offset]))
    			return (0);
    	}
    	return (1);
    }
    
    const char *
    text_get_type(const void *base, size_t size)
    {
    	if (text_try_test(base, size, text_is_ascii))
    		return ("ASCII");
    	if (text_try_test(base, size, text_is_latin1))
    		return ("ISO-8859");
    	if (text_try_test(base, size, text_is_extended))
    		return ("Non-ISO extended-ASCII");
    	return (NULL);
    }
    

    So file(1) is not capable of saying whether a file is UTF-8 right now. There is another file (/etc/magic) that can help determine whether a text file is UTF-7 or UTF-8-EBCDIC, because those need a BOM, but as you said, UTF-8 does not need a BOM. So it looks like we are stuck here :)
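    For what it’s worth, here is a sketch (mine, not file(1) code) of what a hypothetical `text_is_utf8()` in the same style could look like. It can’t be plugged into `text_try_test()` as-is, because UTF-8 validity is a property of byte *sequences*, not single bytes, which is probably part of why it isn’t there. For simplicity this version also accepts overlong encodings and surrogate code points, which a real detector should reject:

    ```c
    /* Hypothetical sketch of a UTF-8 well-formedness check, in the
     * spirit of file(1)'s text_is_ascii()/text_is_latin1() helpers.
     * Simplified: does not reject overlong encodings or surrogates. */
    #include <stddef.h>

    typedef unsigned char u_char;

    static int
    text_is_utf8(const void *base, size_t size)
    {
    	const u_char	*data = base;
    	size_t		 i = 0, len, j;

    	while (i < size) {
    		u_char c = data[i];

    		if (c < 0x80) {			/* 1-byte (ASCII) */
    			i++;
    			continue;
    		} else if ((c & 0xe0) == 0xc0)	/* 2-byte lead: 110xxxxx */
    			len = 2;
    		else if ((c & 0xf0) == 0xe0)	/* 3-byte lead: 1110xxxx */
    			len = 3;
    		else if ((c & 0xf8) == 0xf0)	/* 4-byte lead: 11110xxx */
    			len = 4;
    		else
    			return (0);		/* stray continuation byte */

    		if (i + len > size)
    			return (0);		/* truncated sequence */
    		for (j = 1; j < len; j++)
    			if ((data[i + j] & 0xc0) != 0x80)
    				return (0);	/* bad continuation byte */
    		i += len;
    	}
    	return (1);
    }
    ```

    Every valid ASCII buffer is also valid UTF-8, so a detector would have to run the ASCII test first and only report UTF-8 when the ASCII test fails but this one passes.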




  • I think there is an argument to be made that if you want to develop a game for, say, the PS5, you can hone your game to the PS5’s fixed hardware and it could be extremely stable. This is not possible on PCs because PCs do not have fixed hardware.

    However, I think this was more true in the olden days of the SNES, where games were not glitchy compared to DOS gaming, where hardware compatibility was all over the place. You can see this on YouTube channels like LGR, where finding a compatible sound card is a challenge.

    But like you, I don’t find that this is still true for modern PC gaming.