Thinking about a new Mac...
-
Not arguing whether 8GB or 16GB RAM is enough, just reading up a bit on the M1’s “unified memory architecture”:
https://www.macworld.com/article/3597569/m1-macs-memory-isnt-what-it-used-to-be.html
https://www.howtogeek.com/701804/how-unified-memory-speeds-up-apples-m1-arm-macs/
-
@george-k said in Thinking about a new Mac...:
@Klaus has said that 16GB of RAM is inadequate. However, I've read that with the new chip, even 8GB is good for most "routine" stuff.
It's not "inadequate". But I had 16GB of RAM in my laptop in 2010. Would you still be happy with a hard drive from 2010? Why are you happy with a memory configuration from 2010?
RAM is dirt cheap these days, and it makes a big difference in performance in many situations (even if only having many browser windows open). It's one of the best "bang for the buck" investments - way better than, say, a higher clock frequency in the CPU.
@klaus said in Thinking about a new Mac...:
It's not "inadequate". But I had 16GB of RAM in my laptop in 2010. Would you still be happy with a hard drive from 2010?
False assumption. 16GB of traditional RAM is not the same as 16GB of RAM used by SoC architecture. You can't numerically compare machines from ten years ago to today without taking the hardware changes into consideration.
-
I expect my lights to dim every time Mark turns on his machine.
No problems with my computer, 16 GB RAM. What I find weird is Windows updates, which cause goofy things to occur, like a reboot opening Chrome with pages I viewed weeks ago. I watch and marvel when rebooting: maybe this time Word will open up automatically with a document from weeks ago.
Back to highbrow computer nerd stuff. Interesting to those who follow the thread, even if most of it is like a different language.
-
@aqua-letifer said in Thinking about a new Mac...:
@klaus said in Thinking about a new Mac...:
It's not "inadequate". But I had 16GB of RAM in my laptop in 2010. Would you still be happy with a hard drive from 2010?
False assumption. 16GB of traditional RAM is not the same as 16GB of RAM used by SoC architecture. You can't numerically compare machines from ten years ago to today without taking the hardware changes into consideration.
Huh? Of course you can. It doesn't matter where the RAM is located. Things like access times, cache sizes, etc. matter too, but the amount of RAM is maybe the most important parameter, since if you run out of memory, the performance penalty is extreme. Also, neither the M1 nor Intel Core chips nor the Intel chips from 2010 are "SoCs", so what are you talking about?
-
@klaus said in Thinking about a new Mac...:
Huh? Of course you can. It doesn't matter where the RAM is located.
They're literally saying the opposite. Read the articles Ax posted.
-
There’s a line in The Expanse where Avasarala says something to the effect of “I wish these guys would stop waving their dicks at each other.”
@george-k said in Thinking about a new Mac...:
There’s a line in The Expanse where Avasarala says something to the effect of “I wish these guys would stop waving their dicks at each other.”
Not me. I didn't get a monster machine, and Mark and Klaus would find the specs abysmal. But I use my computer more heavily than average, and I'm not at all worried about the new one crapping out on me.
-
@aqua-letifer said in Thinking about a new Mac...:
@klaus said in Thinking about a new Mac...:
Huh? Of course you can. It doesn't matter where the RAM is located.
They're literally saying the opposite. Read the articles Ax posted.
I did read the articles. The M1's "unified memory architecture" stuff doesn't change any of the considerations for more RAM. It's about sharing the RAM in a more flexible way (which is mostly a moot point if one has an external graphics card). What this basically means is that you run out of RAM on the M1 faster than on traditional architectures. But in any case, it doesn't change anything about the problem that if your applications require X+Y bytes of RAM but you only have X bytes available, you have a big problem.
For instance, if you multiply two numbers that each require, say, 10GB of memory, then a system with 32GB of memory will be orders of magnitude (say, 10x or 100x) faster than a system with 8GB or 16GB of memory, and it doesn't matter one bit whether it is a system from 2010 or an M1 from 2020.
Don't be so easily impressed by tech advertising BS.
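Klaus's swap-penalty point can be made concrete with some back-of-the-envelope arithmetic. A minimal sketch (all bandwidth figures here are ballpark assumptions, not measurements from any particular machine):

```python
# Rough model of the "two 10 GB numbers on an 8 GB machine" example.
# Bandwidth figures are ballpark assumptions; the point is the shape
# of the penalty, not the exact numbers.

GB = 1024**3

ram_bandwidth = 50 * GB   # ~50 GB/s, typical for DDR4 / M1 unified memory
ssd_bandwidth = 2 * GB    # ~2 GB/s, a fast NVMe SSD doing swap duty

def sweep_time(bytes_touched, ram_bytes):
    """Seconds to stream the working set once; whatever doesn't fit
    in RAM has to round-trip through swap on the SSD."""
    in_ram = min(bytes_touched, ram_bytes)
    spilled = bytes_touched - in_ram
    return in_ram / ram_bandwidth + spilled / ssd_bandwidth

working_set = 2 * 10 * GB  # the two 10 GB operands

t_32 = sweep_time(working_set, 32 * GB)  # fits entirely in RAM
t_8 = sweep_time(working_set, 8 * GB)    # 12 GB spills to swap

print(f"32 GB machine: ~{t_32:.2f} s per pass over the data")
print(f" 8 GB machine: ~{t_8:.2f} s per pass ({t_8 / t_32:.0f}x slower)")
```

A big-number multiply makes many passes over its operands, so in practice the gap compounds; real swap traffic is also random-access rather than streaming, which is slower still.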
-
@klaus said in Thinking about a new Mac...:
which is mostly a moot point if one has an external graphics card
From the article: Like Intel chips with integrated graphics, the M1 chip includes a graphics processor
What this basically means is that you run out of RAM on the M1 faster than on traditional architectures.
Again from the article: But Apple isn’t integrating memory into its systems-on-a-chip out of spite. It’s doing it because it’s an approach that can lead to some dramatic speed benefits. ... because all the aspects of the processor can access all of the system memory, there’s no performance hit when the graphics cores need to access something that was previously being accessed by a processor core. On other systems, the data has to be copied from one portion of memory to another—but on the M1, it’s just instantly accessible.
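The zero-copy benefit the article describes can be sketched with a toy model. Everything here is an illustrative assumption (the PCIe bandwidth is a generic ballpark figure, not an Apple or Intel spec):

```python
# Toy model of the copy the article is talking about: a discrete GPU has
# to pull data across PCIe before its cores can touch it; with unified
# memory, CPU and GPU cores address the same physical pool.

GB = 1024**3
pcie_bandwidth = 16 * GB  # ~16 GB/s, roughly PCIe 3.0 x16 (assumption)

def discrete_gpu_handoff(buffer_bytes):
    """Seconds spent copying a buffer from system RAM to VRAM
    before a discrete GPU can start working on it."""
    return buffer_bytes / pcie_bandwidth

def unified_memory_handoff(buffer_bytes):
    """No copy: the same pages are visible to CPU and GPU cores."""
    return 0.0

texture = 2 * GB  # a large asset handed from the CPU to the GPU
print(f"discrete GPU: {discrete_gpu_handoff(texture) * 1000:.0f} ms of pure copying")
print(f"unified:      {unified_memory_handoff(texture) * 1000:.0f} ms")
```

Which supports both readings at once: the shared pool removes the host-to-VRAM copy, but it also means the GPU's working set comes out of the same 8 or 16 GB the applications use.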
-
I’ve had my M1 MacBook Air with 16GB RAM for maybe two weeks now. Haven’t used it much yet. Why? Because I have other Macs that are working fine, so I don’t have to rush to move everything over to the M1 Air. I won’t bother to install any third-party software that does not have native M1 support, so I don’t have to deal with Rosetta 2-translated junk that I may not be able to clean up later.
So far I’ve only got Apple’s own pre-installed stuff, Chrome, Firefox, and GraphicConverter on the M1 Air. Yes, running only these native apps, it does feel faster/more responsive compared to my other Intel Air that also has 16GB RAM.
I just learned that Microsoft released native M1 versions of their core Office/365 applications, but I haven’t gotten around to installing them yet.
I’m really looking forward to native M1 versions of these few things: OBS, Go (programming language/compiler) support, the Dell multifunction printer driver, Zoom, Microsoft Teams, Finale (music notation program), and Adobe’s Creative Cloud applications (though short term I use only Illustrator).
OBS does real-time video encoding, which can really make use of a fast CPU.
After I get Go with native M1 support, I might try a simple performance comparison using a Go program I wrote for one of Jon or Klaus’ puzzles.
I expect Dell’s printer driver with native M1 support will come last, or Dell may punt on it entirely.
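For a native-vs-Rosetta comparison like that, a simple best-of-N timing harness is usually enough. A minimal sketch in Python (the real test would be the Go program; `puzzle_workload` is just a hypothetical stand-in):

```python
import time

def benchmark(fn, *args, repeats=5):
    """Run fn several times and return the best wall-clock time;
    best-of-N filters out scheduling noise better than a mean."""
    times = []
    for _ in range(repeats):
        start = time.perf_counter()
        fn(*args)
        times.append(time.perf_counter() - start)
    return min(times)

def puzzle_workload(n=200_000):
    # Hypothetical stand-in for the real puzzle solver.
    return sum(i * i for i in range(n))

elapsed = benchmark(puzzle_workload)
print(f"best of 5: {elapsed * 1000:.1f} ms")
# Run the same script under Rosetta 2 and natively, then compare.
```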
-
@aqua-letifer said in Thinking about a new Mac...:
@klaus said in Thinking about a new Mac...:
which is mostly a moot point if one has an external graphics card
From the article: Like Intel chips with integrated graphics, the M1 chip includes a graphics processor
What this basically means is that you run out of RAM on the M1 faster than on traditional architectures.
Again from the article: But Apple isn’t integrating memory into its systems-on-a-chip out of spite. It’s doing it because it’s an approach that can lead to some dramatic speed benefits. ... because all the aspects of the processor can access all of the system memory, there’s no performance hit when the graphics cores need to access something that was previously being accessed by a processor core. On other systems, the data has to be copied from one portion of memory to another—but on the M1, it’s just instantly accessible.
So...? That doesn't contradict anything I said. And it doesn't mean that you can't compare 16GB 2010 RAM with 16 GB 2020 M1 RAM. Do you understand what the article says? I have the impression that you are only reacting to single catch phrases but not really parsing the sentences.
-
@klaus said in Thinking about a new Mac...:
So...? That doesn't contradict anything I said. And it doesn't mean that you can't compare 16GB 2010 RAM with 16 GB 2020 M1 RAM. Do you understand what the article says? I have the impression that you are only reacting to single catch phrases but not really parsing the sentences.
That's right, Klaus, I'm an idiot. I'm also completely unaware of your attempt to railroad the conversation by focusing on narrow parameters you've tried to set because only you care about them.
-
@aqua-letifer said in Thinking about a new Mac...:
@klaus said in Thinking about a new Mac...:
So...? That doesn't contradict anything I said. And it doesn't mean that you can't compare 16GB 2010 RAM with 16 GB 2020 M1 RAM. Do you understand what the article says? I have the impression that you are only reacting to single catch phrases but not really parsing the sentences.
That's right, Klaus, I'm an idiot. I'm also completely unaware of your attempt to railroad the conversation by focusing on narrow parameters you've tried to set because only you care about them.
It was you who chose to attack my statement by claiming that one can't compare memory sizes.
I happen to be pretty well educated about computer architecture. I don't blame anyone for not knowing much about the subject, but if you choose to attack my statement, it would be more convincing if your knowledge of the matter exceeded a few buzzwords you read in an article.
-
@jon-nyc said in Thinking about a new Mac...:
I've been in movie theaters with smaller screens. lol
I decided it wasn't big enough and just ordered a 49" for the big gaming rig.
https://www.amazon.com/gp/product/B07L9HCJ2V/ref=ppx_yo_dt_b_asin_title_o01_s00?ie=UTF8&psc=1
5120 x 1440 at 120Hz
I also scored a top-of-the-line AMD GPU last night: the SAPPHIRE Radeon RX 6900 XT, with 16 GB of RAM. I was one of over 10,000 people fighting the refresh battle, waiting for the "Out Of Stock" indicator to change to an "Add to Cart" button. This time I was prepared; on my previous attempt I was unsuccessful because of delays setting up the account, Google Pay, etc. You can add the item to your cart, but it is not reserved for you until you actually check out.
Last night I must have won the initial refresh battle, because as soon as the page loaded, the "Add to Cart" button appeared! I was in shock. I clicked it. When the new page appeared, I clicked three checkboxes to agree to the terms, clicked the G-Pay button, clicked Pay Now on Google Pay, and what do you know? I GOT IT!
So now I am going to have two kick-ass gaming rigs. The other GPU I have on backorder is an Nvidia ASUS ROG Strix RTX 3080 OC. It is second only to the RTX 3090, which costs anywhere from $1,500 to $2,400. The 3080 only has 10 GB of RAM, but its ray tracing and DLSS (upscaling) are currently superior to the AMD RX 6900 XT's.
The 49" monitor should be about the limit of what I would ever want in a gaming monitor.
On the business side of the equation I will be able to fit three or four, full page code windows on the screen at once.
@mark said in Thinking about a new Mac...:
@jon-nyc said in Thinking about a new Mac...:
I've been in movie theaters with smaller screens. lol
I decided it wasn't big enough and just ordered a 49" for the big gaming rig.
https://www.amazon.com/gp/product/B07L9HCJ2V/ref=ppx_yo_dt_b_asin_title_o01_s00?ie=UTF8&psc=1
5120 x 1440 at 120Hz
The 49" monitor should be about the limit of what I would ever want in a gaming monitor.
Well, that didn't last long. lol
I got to thinking about the refresh rate of 120 Hz and the 4 ms GTG rating, and how in 2020 Samsung released a version of this monitor with a 240 Hz refresh rate and 1 ms (GTG).
Because my new GPUs will easily handle high refresh rates at 1440 resolution, I decided that I wanted the monitor to handle the higher refresh rates. The faster GTG (gray-to-gray) rating is not something I will probably notice, and the measurement varies from manufacturer to manufacturer, so it is mostly irrelevant, except that a faster response does let the pixel remain at its assigned value longer, so "ghosting" and motion blur are reduced. Most human eyes will never see the difference. But it is faster than the same model from the previous year, so at least the GTG testing procedure is consistent here.
So, I cancelled the order for the CRG9 and ordered the Odyssey G9:
5120 x 1440 @ 240 Hz, 1 ms (GTG)
HDMI, DisplayPort, and USB 3.0
2500:1 contrast ratio
1.07 billion display colors
-
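As a sanity check, those specs imply a serious amount of raw video bandwidth. A rough calculation (assuming 10 bits per channel to match the 1.07-billion-color figure, and ignoring blanking overhead, so the real figure is somewhat higher):

```python
# Raw video bandwidth for 5120 x 1440 @ 240 Hz with 10-bit color.

width, height, refresh_hz = 5120, 1440, 240
bits_per_pixel = 3 * 10  # RGB, 10 bits per channel

bits_per_second = width * height * refresh_hz * bits_per_pixel
gbit_per_second = bits_per_second / 1e9

print(f"~{gbit_per_second:.1f} Gbit/s uncompressed")
```

DisplayPort 1.4 carries roughly 26 Gbit/s of payload, so a stream like this only fits over a single cable with Display Stream Compression, which is presumably how monitors in this class drive 240 Hz.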
@aqua-letifer good for you!
Let us know how it works out, please!
@Klaus has said that 16GB of RAM is inadequate. However, I've read that with the new chip, even 8GB is good for most "routine" stuff. Battery life is another big deal.
Just looking at cost, the MacBook Air with an external monitor still comes out just a bit cheaper than the iMac if similarly specced (RAM and SSD), but the convenience of portability is a big factor. The Mac mini is still cheaper, but no portability.
OTOH, for my needs, my iPad does everything I need for portability.
My only concern is the I/O - I have an external 4-bay hard drive enclosure that houses all of my 1) Music/Books 2) Time Machine 3) Time Machine 4) Additional backup. I also need to hook up keyboard and mouse. The mini would do the job, but the portability of the MacBook Air is a strong pull, as long as I can get everything I have now.
I'll probably wait until next year and see how the rumored new iMacs spec out. As I've said, my eyes probably won't suffer from a slightly inferior display, and I might save a couple of hundred bucks by going with the mini or Air.
-
@george-k said in Thinking about a new Mac...:
@aqua-letifer good for you!
Let us know how it works out, please!
Just got it in the mail. But it's so cold from being on the UPS truck for hours that I'm actually afraid to turn it on. I'm letting it thaw out first.
It'll be an interesting experiment. I'm firmly in the pocket of Microsoft now, and a lot of my shit is on OneDrive. So we'll see how easy transitioning is going to be without working from a Mac backup.
-
@George-K Okay, posting now using the thing.
It's fine. I mean the thing just works. Premiere Pro and After Effects are both running just fine on it.
The keyboard works great but it's got that kind of surface that gets really dingy in a hurry.
Display is awesome. Trackpad's very generous.
Cons:
- No ports whatsoever. I mean holy shit, it's got one damn USB-C when you have the charger plugged in. Oh and a headphone jack.
Because let's bring that back.
- Adobe hasn't yet made new Mac-friendly versions of their software, so you have to use Rosetta to run everything. Huh. They're working on it, though, and I doubt many people care.
-