Classic Mac OS memory management

[Image: 'About This Computer' window in Mac OS 9.1, showing the memory consumption of each open application and of the system software itself.]

Historically, the classic Mac OS used a form of memory management that has fallen out of favor in modern systems. Criticism of this approach was one of the key areas addressed by the change to Mac OS X.

The original problem for the engineers of the Macintosh was how to make optimum use of the 128 KB of RAM with which the machine was equipped, on Motorola 68000-based computer hardware that did not support virtual memory.[1] Since at that time the machine could run only one application program at a time, and there was no fixed secondary storage, the engineers implemented a simple scheme which worked well with those particular constraints. That design choice did not scale well with the development of the machine, creating various difficulties for both programmers and users.

Fragmentation

The primary concern of the original engineers appears to have been fragmentation – that is, the repeated allocation and deallocation of memory through pointers leading to many small isolated areas of memory which cannot be used because they are too small, even though the total free memory may be sufficient to satisfy a particular request for memory. To solve this, Apple engineers used the concept of a relocatable handle: a reference to memory which allowed the data it referred to to be moved without invalidating the handle. Apple's scheme was simple – a handle was merely a pointer into a (non-relocatable) table of further pointers, which in turn pointed to the data.[2] If a memory request required compaction of memory, this was done and the table, called the master pointer block, was updated. The machine itself implemented two areas in memory available for this scheme – the system heap (used for the OS), and the application heap.[3] As long as only one application at a time was run, the system worked well. Since the entire application heap was dissolved when the application quit, fragmentation was minimized.
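
To make the handle indirection concrete, here is a minimal, self-contained C sketch of the idea. It is a toy model rather than real Toolbox code: new_handle, compact, and the four-slot master_pointers table are invented for illustration, though the double-pointer shape (a Handle is a pointer to a Ptr) matches the classic declaration.

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Toy model of classic Mac OS handles -- not the real Toolbox headers. */
typedef char *Ptr;
typedef Ptr  *Handle;          /* a handle is a pointer to a master pointer */

#define MASTER_SLOTS 4
static Ptr master_pointers[MASTER_SLOTS];   /* stand-in for the master pointer block */

/* Hand out a Handle: the address of one master-pointer slot. */
static Handle new_handle(size_t size)
{
    for (int i = 0; i < MASTER_SLOTS; i++) {
        if (master_pointers[i] == NULL) {
            master_pointers[i] = malloc(size);       /* error handling omitted */
            return &master_pointers[i];
        }
    }
    return NULL;
}

/* "Compaction": relocate the block.  Only the master pointer changes, so
   every Handle that refers to this slot remains valid after the move. */
static void compact(Handle h, size_t size)
{
    Ptr moved = malloc(size);
    memcpy(moved, *h, size);
    free(*h);
    *h = moved;
}

int main(void)
{
    Handle h = new_handle(32);
    strcpy(*h, "hello");       /* double dereference reaches the data */
    compact(h, 32);            /* the data moves ...                  */
    printf("%s\n", *h);        /* ... but the handle still finds it   */
    return 0;
}
```

Because callers keep only the slot address, the Memory Manager is free to slide blocks around during compaction without breaking anyone's references.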

The memory management system had weaknesses; the system heap was not protected from errant applications, as would have been possible if the system architecture had supported memory protection, and this was frequently the cause of system problems and crashes.[4] In addition, the handle-based approach opened up a source of programming errors, because pointers to data within such relocatable blocks could not be guaranteed to remain valid across calls that might cause memory to move. This was a real problem for almost every system API that existed. Because of the transparency of system-owned data structures at the time, the APIs could do little to solve this. Thus the onus was on the programmer not to create such pointers, or at least to manage them very carefully by re-dereferencing handles after every such API call. Since many programmers were not generally familiar with this approach, early Mac programs frequently suffered from faults arising from it.[5]
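
The stale-pointer hazard can be sketched against the classic Toolbox API. This assumes the classic Memory Manager headers (MacMemory.h in the later Universal Interfaces, Memory.h in earlier ones); NewHandle, DisposeHandle, HLock, and HUnlock are real Memory Manager calls, while broken and safer are hypothetical example routines.

```c
#include <MacMemory.h>   /* classic/Carbon Memory Manager declarations */

/* BUG: caches a dereferenced handle across a call that may move memory. */
void broken(Handle h)
{
    char *p = *h;                      /* pointer into the relocatable block  */
    Handle other = NewHandle(10000);   /* may compact the heap and move *h    */
    p[0] = 'X';                        /* p may now point at stale memory     */
    DisposeHandle(other);
}

/* Safer: pin the block for the duration of the raw-pointer use. */
void safer(Handle h)
{
    HLock(h);                          /* lock flag set; compaction skips it  */
    char *p = *h;
    Handle other = NewHandle(10000);
    p[0] = 'X';                        /* still valid: the block cannot move  */
    DisposeHandle(other);
    HUnlock(h);                        /* a single HUnlock clears the flag    */
}
```

The discipline was to keep such locks as short-lived as possible, because a locked block in the middle of the heap defeats compaction and reintroduces the fragmentation the handles were meant to avoid.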

Palm OS and 16-bit Windows use a similar scheme for memory management, but the Palm and Windows versions make programmer error more difficult. For instance, in Mac OS, to convert a handle to a pointer, a program just de-references the handle directly, but if the handle is not locked, the pointer can become invalid quickly. Calls to lock and unlock handles are not balanced; ten calls to HLock are undone by a single call to HUnlock.[6] In Palm OS and Windows, handles are an opaque type and must be de-referenced with MemHandleLock on Palm OS or Global/LocalLock on Windows. When a Palm or Windows application is finished with a handle, it calls MemHandleUnlock or Global/LocalUnlock. Palm OS and Windows keep a lock count for blocks; after three calls to MemHandleLock, a block will only become unlocked after three calls to MemHandleUnlock.
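
A toy model of the two locking disciplines makes the difference visible; the types and function names below are invented for illustration and do not call the real Toolbox, Palm OS, or Windows APIs.

```c
#include <stdio.h>

/* Classic Mac OS style: the lock is a single flag, so nested lock/unlock
   pairs do not balance -- the first unlock frees the block to move. */
typedef struct { int locked; } FlagBlock;
static void flag_lock(FlagBlock *b)   { b->locked = 1; }
static void flag_unlock(FlagBlock *b) { b->locked = 0; }

/* Palm OS / 16-bit Windows style: the lock is a count, so every lock must
   be matched by an unlock before the block becomes movable again. */
typedef struct { int lock_count; } CountedBlock;
static void counted_lock(CountedBlock *b)   { b->lock_count++; }
static void counted_unlock(CountedBlock *b) { if (b->lock_count > 0) b->lock_count--; }

int main(void)
{
    FlagBlock f = {0};
    flag_lock(&f); flag_lock(&f); flag_lock(&f);
    flag_unlock(&f);
    printf("flag model still locked: %d\n", f.locked);            /* 0: free to move */

    CountedBlock c = {0};
    counted_lock(&c); counted_lock(&c); counted_lock(&c);
    counted_unlock(&c);
    printf("count model still locked: %d\n", c.lock_count > 0);   /* 1: still pinned */
    return 0;
}
```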

Addressing the problem of nested locks and unlocks can be straightforward (although tedious) by employing various methods, but these intrude upon the readability of the associated code block and require awareness and discipline on the part of the coder.

Memory leaks and stale references

Awareness and discipline are also necessary to avoid memory 'leaks' (failure to deallocate within the scope of the allocation) and to avoid references to stale handles after release (which usually resulted in a hard crash—annoying on a single-tasking system, potentially disastrous if other programs are running).

Switcher

The situation worsened with the advent of Switcher, which was a way for a Mac with 512 KB or more of memory to run multiple applications at once.[7] This was a necessary step forward for users, who found the one-app-at-a-time approach very limiting. Because Apple was now committed to its memory management model, as well as to compatibility with existing applications, it was forced to adopt a scheme in which each application was allocated its own heap from the available RAM.[8] The amount of RAM allocated to each heap was set by a value the programmer coded into each application's metadata. Sometimes this value wasn't enough for particular kinds of work, so the setting had to be exposed to the user to allow them to tweak the heap size to suit their own requirements. While popular among 'power users', this exposure of a technical implementation detail was against the grain of the Mac user philosophy. Apart from exposing users to esoteric technicalities, it was inefficient, since an application would grab all of its allotted RAM even if it left most of it unused. Another application might be memory-starved, yet unable to use the free memory 'owned' by a different one.[3]

While an application could not beneficially utilize a sister application's heap, it could certainly destroy it, typically by inadvertently writing to a nonsense address. An application accidentally treating a fragment of text or image, or an unassigned location as a pointer could easily overwrite the code or data of other applications or even the OS, leaving 'lurkers' even after the program was exited. Such problems could be extremely difficult to analyze and correct.

Switcher evolved into MultiFinder in System 4.2, which became the Process Manager in System 7, and by then the scheme was long entrenched. Apple made some attempts to work around the obvious limitations. Temporary memory was one: an application could 'borrow' free RAM that lay outside its heap for short periods, but this was unpopular with programmers and largely failed to solve the problem. Apple's System 7 Tune-up add-on added a 'minimum' memory size and a 'preferred' size: if the preferred amount of memory was not available, the program could launch in the minimum space, possibly with reduced functionality. This was incorporated into the standard OS starting with System 7.1, but it still did not address the root problem.[9]
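
A rough sketch of the launch-time decision implied by the 'minimum' and 'preferred' sizes might look like the following. AppPartition and choose_partition are invented names; the real values lived in each application's metadata rather than in code like this.

```c
#include <stdio.h>

/* Toy model of the minimum/preferred partition choice described above. */
typedef struct {
    long minimum_bytes;     /* smallest heap the app claims it can run in */
    long preferred_bytes;   /* heap the app would like to get             */
} AppPartition;

static long choose_partition(AppPartition app, long free_ram)
{
    if (free_ram >= app.preferred_bytes) return app.preferred_bytes;
    if (free_ram >= app.minimum_bytes)   return app.minimum_bytes;   /* reduced functionality */
    return 0;                                                        /* cannot launch */
}

int main(void)
{
    AppPartition word_processor = { 512L * 1024, 1024L * 1024 };
    printf("%ld\n", choose_partition(word_processor, 2048L * 1024)); /* gets preferred size   */
    printf("%ld\n", choose_partition(word_processor, 700L * 1024));  /* falls back to minimum */
    printf("%ld\n", choose_partition(word_processor, 300L * 1024));  /* cannot launch         */
    return 0;
}
```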

Virtual memory schemes, which made more memory available by paging unused portions of memory to disk, were made available by third-party utilities like Connectix Virtual, and then by Apple in System 7. This increased Macintosh memory capacity at a performance cost, but did not add protected memory or prevent the memory manager's heap compaction that would invalidate some pointers.

32-bit clean

Originally the Macintosh had 128 KB of RAM, with a limit of 512 KB. This was increased to 4 MB upon the introduction of the Macintosh Plus. These Macintosh computers used the 68000 CPU, a 32-bit processor, but only had 24 physical address lines. The 24 lines allowed the processor to address up to 16 MB of memory (2^24 bytes), which was seen as a sufficient amount at the time. The RAM limit in the Macintosh design was 4 MB of RAM and 4 MB of ROM, because of the structure of the memory map.[10] This was fixed by changing the memory map with the Macintosh II and the Macintosh Portable, allowing up to 8 MB of RAM.

Because memory was a scarce resource, the authors of the Mac OS decided to take advantage of the unused byte in each address. The original Memory Manager (up until the advent of System 7) placed flags in the high 8 bits of each 32-bit pointer and handle. Each address contained flags such as 'locked', 'purgeable', or 'resource', which were stored in the master pointer table. When used as an actual address, these flags were masked off and ignored by the CPU.[4]

While a good use of very limited RAM space, this design caused problems when Apple introduced the Macintosh II, which used the 32-bit Motorola 68020 CPU. The 68020 had 32 physical address lines which could address up to 4 GB (2^32 bytes) of memory. The flags that the Memory Manager stored in the high byte of each pointer and handle were significant now, and could lead to addressing errors.
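
A small, self-contained example illustrates the failure mode. The exact flag bit positions are assumptions made for the sake of the sketch; the point is only that a value which worked as an address on a 24-bit bus stops working once all 32 bits reach the address pins.

```c
#include <stdio.h>
#include <stdint.h>

/* Illustrative sketch, not Toolbox code: flags packed into the high byte
   of a master pointer, as on the 24-bit Macs.  Bit positions are assumed. */
#define FLAG_LOCKED    0x80000000u   /* hypothetical "locked" flag    */
#define FLAG_PURGEABLE 0x40000000u   /* hypothetical "purgeable" flag */
#define ADDR_MASK_24   0x00FFFFFFu   /* low 24 bits cover the whole 68000 bus */

int main(void)
{
    uint32_t master_pointer = 0x00045678u | FLAG_LOCKED | FLAG_PURGEABLE;

    /* On a 24-bit machine the high byte never reaches the address pins,
       so the flagged value still works as an address. */
    uint32_t addr_24bit = master_pointer & ADDR_MASK_24;

    /* With full 32-bit addressing the flag bits become part of the
       address, pointing near the top of the 4 GB space. */
    uint32_t addr_32bit = master_pointer;

    printf("24-bit effective address: 0x%08X\n", (unsigned)addr_24bit);   /* 0x00045678 */
    printf("32-bit effective address: 0x%08X\n", (unsigned)addr_32bit);   /* 0xC0045678 */
    return 0;
}
```

The classic StripAddress call performed a similar masking for code that needed a flag-free address, which is part of how the transition was papered over.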

In theory, the architects of the Macintosh system software were free to change the 'flags in the high byte' scheme to avoid this problem, and they did. For example, on the Macintosh IIci and later machines, HLock() and other APIs were rewritten to implement handle locking in a way other than flagging the high bits of handles. But many Macintosh application programmers and a great deal of the Macintosh system software code itself accessed the flags directly rather than using the APIs, such as HLock(), which had been provided to manipulate them. By doing this they rendered their applications incompatible with true 32-bit addressing, and this became known as not being '32-bit clean'.

In order to stop continual system crashes caused by this issue, System 6 and earlier running on a 68020 or a 68030 would force the machine into 24-bit mode, and would only recognize and address the first 8 megabytes of RAM, an obvious flaw in machines whose hardware was wired to accept up to 128 MB of RAM, and whose product literature advertised this capability. With System 7, the Mac system software was finally made 32-bit clean, but there was still the problem of 'dirty' ROMs: the decision to use 24-bit or 32-bit addressing had to be made very early in the boot process, when the ROM routines initialized the Memory Manager to set up a basic Mac environment in which NuBus ROMs and disk drivers were loaded and executed. Older ROMs did not have any 32-bit Memory Manager support, so it was not possible to boot into 32-bit mode. Surprisingly, the first solution to this flaw was published by the software utility company Connectix, whose 1991 product MODE32 reinitialized the Memory Manager and repeated early parts of the Mac boot process, allowing the system to boot into 32-bit mode and enabling the use of all the RAM in the machine. Apple licensed the software from Connectix later in 1991 and distributed it for free. The Macintosh IIci and later Motorola-based Macintosh computers had 32-bit clean ROMs.

It was quite a while before applications were updated to remove all 24-bit dependencies, and System 7 provided a way to switch back to 24-bit mode if application incompatibilities were found.[3] By the time of the migration to PowerPC and System 7.1.2, 32-bit cleanliness was mandatory for creating native applications, and even later Motorola 68040-based Macs could not support 24-bit mode.[6][11]

Object orientation

The rise of object-oriented languages for programming the Mac – first Object Pascal, then later C++ – also caused problems for the memory model adopted. At first, it would seem natural that objects would be implemented via handles, to gain the advantage of being relocatable. These languages, as they were originally designed, used pointers for objects, which would lead to fragmentation issues. A solution, implemented by the THINK (later Symantec) compilers, was to use handles internally for objects, but use a pointer syntax to access them. This seemed a good idea at first, but soon deep problems emerged, since programmers could not tell whether they were dealing with a relocatable or a fixed block, and so had no way of knowing whether to take on the task of locking objects or not. Needless to say, this led to huge numbers of bugs and problems with these early object implementations. Later compilers did not attempt to do this, but used real pointers, often implementing their own memory allocation schemes to work around the Mac OS memory model.

While the Mac OS memory model, with all its inherent problems, remained in place right through to Mac OS 9, due to severe application compatibility constraints, the increasing availability of cheap RAM meant that by and large most users could upgrade their way out of a corner. The memory was not used efficiently, but it was abundant enough that the issue never became critical. This is ironic given that the purpose of the original design was to maximize the use of very limited amounts of memory. Mac OS X finally did away with the whole scheme, implementing a modern sparse virtual memory scheme. A subset of the older memory model APIs still exists for compatibility as part of Carbon, but maps to the modern memory manager (a thread-safe malloc implementation) underneath.[6] Apple recommends that Mac OS X code use malloc and free 'almost exclusively'.[12]

References

  1. ^ Hertzfeld, Andy (September 1983), The Original Macintosh: We're Not Hackers!, retrieved May 10, 2010.
  2. ^ Hertzfeld, Andy (January 1982), The Original Macintosh: Hungarian, archived from the original on June 19, 2010, retrieved May 10, 2010.
  3. ^ a b c memorymanagement.org (December 15, 2000), Memory management in Mac OS, archived from the original on May 16, 2010, retrieved May 10, 2010.
  4. ^ a b Hertzfeld, Andy, The Original Macintosh: Mea Culpa, retrieved May 10, 2010.
  5. ^ Apple Computer (October 1, 1985), Technical Note OV09: Debugging With PurgeMem and CompactMem, retrieved May 10, 2010.
  6. ^ a b c Legacy Memory Manager Reference, Apple Inc., June 27, 2007, retrieved May 10, 2010.
  7. ^ Hertzfeld, Andy (October 1984), The Original Macintosh: Switcher, retrieved May 10, 2010.
  8. ^ Mindfire Solutions (March 6, 2002), Memory Management in Mac OS (PDF), p. 2, retrieved May 10, 2010.
  9. ^ "System 7.1 upgrade guide" (PDF), archived from the original (PDF) on March 4, 2016, retrieved May 26, 2015.
  10. ^ "memory maps", Osdata.com, March 28, 2001, retrieved May 11, 2010.
  11. ^ Apple Computer (January 1, 1991), Technical Note ME13: Memory Manager Compatibility, retrieved May 10, 2010.
  12. ^ Memory Allocation Recommendations on OS X, Apple Inc., July 12, 2005, retrieved September 22, 2009.

External links

  • Macintosh: ROM Size for Various Models, Apple Inc., August 23, 2000, retrieved September 22, 2009.

Retrieved from 'https://en.wikipedia.org/w/index.php?title=Classic_Mac_OS_memory_management&oldid=1008965847'

After being on computers for 25 years, I did something I've never done before: I purchased a Mac. For the last 10 or so years, I've owned PCs or laptops that were set up to boot to Windows or some flavor of Linux. That flavor of Linux has been Ubuntu for a long time. Until 12.04 and Unity came along, this was a pretty happy experience. I stuck with the 10.x versions, only dabbling in the 11.x versions until 12.04 came out. I've written many blog posts regarding Ubuntu and the use of Ubuntu over the years. I've even written open source software for it. You could even say I was a fanboi for a while, trying to push all my friends and family away from the evil of Mac or Windows to the open beauty that was Ubuntu.

Over my years of computing, I too had developed a completely biased sense against the Apple ecosystem. The oft-touted 'walled garden', 'my device, my rules', and 'locked into Apple' arguments were valid to me. I couldn't, for the life of me, understand why people were paying so much money for something that seemed so obviously wrong. My mind has been changed, and I would like to share the story of how it happened, since I'm much, much happier for it.

Here's a screenshot of the very last time I booted into my Ubuntu installation to push some files onto an external drive:

This is representative of the constant, buggy struggle that Ubuntu became for me. All I had was a dual-monitor setup on an NVIDIA card with an Intel chipset. Nothing particularly special or weird; it was a rig I had built to play Battlefield 3 back when I was still a gamer. I decided that, in addition to my laptop being a dual-boot machine, I needed my PC to dual-boot since I would be working from home.

The Struggle

The first struggle came when trying to set up a wireless USB adapter. I was able to find a driver and fire up the abomination that was NDISwrapper. Unfortunately, I ended up Googling around for hours to find a solution, which consisted of modifying the driver itself, before the USB adapter would work. Once it was actually working, it would just randomly stop every once in a while. This never occurred in Windows. This required me to remove and re-insert the USB adapter constantly. I actually moved my PC onto my desk instead of on the floor because I got sick of bending over to take care of this.

Next came the display. Ubuntu, for some reason, labeled my two monitors as 'Laptop' and treated them as a single screen. This meant I could not use the regular display configuration. I was forced to use the NVIDIA display configuration utility. This also meant I constantly struggled with apps not knowing how to go full screen properly, weird issues dragging windows around, and other oddities.

Onto the window manager. Beyond the frequent crashing for no particular reason, there were constant glitches. Leave the computer for a while and come back? Title bars for windows would become glitchy and unreadable. Restoring a window from being minimized? Sometimes it would just be white; go ahead and minimize and restore it again to fix it. Icons randomly disappearing from the dock, requiring a restart of Unity? Yup, pretty consistent there too. Not only this, but the experience felt laggy. On the beefy machine it was running on, I expected the performance to be smooth and responsive, but it was quite the opposite.

Unfortunately, none of these bugs were consistent enough to recreate reliably, which meant I just had to deal with them until they became so infuriating that I would have to find a fix.

I decided enough was enough and Unity was not going to work for me. At the login screen, it's possible to select GNOME instead of Unity, so I gave that a shot, assuming going back to GNOME would work. As soon as it booted, I was greeted by a monitor that didn't work and a window full of error messages. After Googling around for a while, I found some configuration changes to make in my xorg.conf file and was able to actually get it working. I was met with even more errors and problems, so I decided it wasn't going to work for me and switched back to Unity.

When I came back? All of my settings were gone. All of my changes to use a sane Alt-Tab in the CompizConfig Settings Manager, my keyboard shortcuts, everything. I was back to what it was when I first installed. Extremely frustrated, I decided to give the Mint side of things a try and give Cinnamon a shot. Again, a few weird problems, but got that running as well. Cinnamon didn't quite fit the bill either. I ran into a display issue or two and found myself actually missing a few things from Unity so I decided I was going to dig in and really give Unity a shot. I didn't want to jump ship to an entirely different flavor of Linux because I had already invested so many years in getting used to the Ubuntu experience.

When I purchased a printer for my computer? Of course Ubuntu had no idea what to do with it. Of course there was a runaround necessary to get it working. Even the mouse had problems. My old Logitech MX500, for one reason or another, would spam the logs in dmesg whenever I was using the scroll buttons on it. Sound would skip while listening to music using anything Flash or HTML5 related, like Grooveshark or Pandora. The whole system would lock up occasionally, pegging a quad-core CPU for no reason at all. Sometimes, it would just crash entirely.

How I Want to Spend My Time

When I'm at a computer, it's because I want to get things done. Gone are the days when I have time to tinker around and spend countless hours Googling for some obscure mail archive to find I need to change 'bop' to 'boop' in /etc/something/config.ini. The amount of time that I had to spend doing this crap was growing instead of shrinking. This is not a good direction for an operating system to go.

Over the years, I've developed enough acumen to get a lot done in short periods of time. I've found that I work in extremely productive bursts. This means, when I'm ready to get down to business: I'm ready to get down to business. I don't want anything getting in my way. The glitches I had experienced in previous versions of Ubuntu were ones I could fix, get out of the way, and not have to worry about again. They were reproducible, identifiable, and the fixes worked for me.

With Unity and 12.04, the glitches were random, weird, didn't offer any useful information, and were downright annoying. Some fixes would work for a while then stop working. Some bugs, like the aforementioned blank window, I simply couldn't figure out after a couple hours of Googling so I just got used to them as best I could.

Hours of Googling, being frustrated, and being bumped out of the zone due to random glitches were no longer acceptable to me.

Making the Switch

I knew I had to make a switch. At my most recent job, I was given an iPhone 3G (at a time when the 4 was new) and it was the first Apple product I'd owned. It was an okay device but it was a hand-me-down and I was much more impressed by the 4. By the time the iPhone 4S came around, I was eligible for an upgrade. I decided to take the plunge and it literally changed something inside me. My immediate thought after experiencing the device was: 'I want to build things for this.'

Exploring many options for iOS development, I looked into building a Hackintosh for a while, until I realized it wasn't going to be as stable an experience as I wanted. Since I could still get the development done that I needed to, and had recently built a gaming rig, I couldn't justify the switch. So I just dreamed of eventually having some spare dough around to drop on a Mac, but wasn't terribly serious about buying one.

Fast forward a year or two and it was Christmas time. I wanted to get something nice for myself that I would enjoy. I struggled back and forth again over justifying the cost for dropping into the Mac ecosystem. Back and forth I went until I decided, yet again, I couldn't quite justify the switch. I bought myself a New iPad ensuring I would be able to return it if I didn't like it. Of course, I loved it. My wife gave up her Kindle Fire usage and we shared the iPad. It was incredibly powerful, had a beautiful screen, and was light years beyond any tablet experience in terms of responsiveness, design and construction. I had another 'I want to build things for this' moment.

Now the desire for iOS development was getting stronger. The MBP Retina came out and I was absolutely drooling over it. I wanted one so badly, but I still couldn't justify the cost. 'I'm not building anything in iOS, yet. Maybe I'll hate the OS and be stuck with a $2,500 bad decision. Walled garden. Non-customizable.' Those were the thoughts that were keeping me from taking the plunge.

Taking the Plunge

Eventually, I was sick and tired of not being able to spend time developing in the zone due to random glitches and small problems. I didn't want to spend hours or days finding solutions. In short:

I was tired of spending time on my computer working on my operating system instead of working on my projects.

I carefully considered all my options. We don't have an actual Mac store here, so I didn't have the pleasure of being able to identify, play with, and choose the right Mac for me. I had to go on specs, the advice of others, and my gut instinct.

I wanted the MBP retina, of course. My friend got one and it was an absolutely incredible machine. The design, the responsiveness of the SSD architecture, the retina display; it was beautiful and I was jealous.

Since it was my first foray into the Mac environment, I didn't want to be disappointed. I decided that I would be much happier if I purchased something on the lower end, in case my pre-conceived notions about the OS were correct. I had a choice between the Air, the MBP and the Mini. My local store was constantly out of the upgraded versions and Apple does not ship here.

I decided the base MBP would be the best decision for me. The reason being: I had no idea where I was going to feel a bottleneck on the OS during my development. Would I really be CPU bound? Would not having an SSD really slow things down that much? I had no previous experience, so I had no idea if those things were worth it. The MBP was upgradeable, so I could fix anything I perceived as a shortcoming in my experience. With the Air I was locked in, and I didn't know enough to know whether that was okay with me. The store was constantly out of the upgraded Mac Mini version, so I settled on the last base 13″ MBP they had in the store.

I was elated bringing the box home. The unboxing was, of course, elegant and easy, as with my previous Apple products. The physicality of the product was awesome. I turned it on, went through the simple configuration, and was up and running pretty quickly. While waiting for the initial setup to complete, I started reading about the various things I could do with my new OS: the trackpad and the gestures that were possible. I checked out some things that were 'must install' for every user and started to make a list of the OS X apps I had always wanted to try.

Welcome to OS X

The operating system came up and it was beautiful. The responsiveness, the elegance, and the simplicity were awesome. Having been a consistent hater of trackpads over the years, the trackpad was one of the first things I played with. I had read reviews calling it incredible, and they were not exaggerating in the slightest. The gestures make sense and are very useful. The trackpad itself worked really well and wasn't constantly triggered by my thumbs accidentally brushing it.

Then I started to dive into the operating system. What I found was absolutely shocking: it was far more customizable than I had ever dreamed. Want to move the dock around? Sure, go ahead. In Ubuntu? Nope. Want to change how the mouse scroll wheel works? There's a program someone wrote for that. Every tiny adjustment I wanted was available either directly in the OS or through the installation of a simple program.

Ubuntu had introduced the ability to launch a program or find something from the dash and I had started to like it, even though it was buggy and extremely slow in a lot of cases. Spotlight? Completely blows it away. It's fast, responsive, and gets me to the program or thing I'm looking for every time.

Every device I hooked up to the machine worked flawlessly. Printer? Plug it in, it finds what you need, and you're good to go. Monitor? Plug it in and it recognizes it correctly and makes it available to start working right away. External hard drive? I plugged it in and it immediately asked me if I wanted to start using it for backups. It's like the OS knew what I wanted and was eager to please every step of the way, instead of the struggle I was used to. I didn't have to tell it, nay, smash it in the face with a hammer, to force it to work. In other words:

OS X delivered the experience I wanted Linux to, and more.

What's My PC Up to Now?

My PC now sits on the floor under the desk. Beyond grabbing some files off of it, it's dormant. I keep it around in case I need to boot up Windows and test something, or if I get the itch to start gaming again. But I am now a 100% Mac convert.

The hardware is great. The OS is a constant pleasure. All my time that I want to spend developing or doing things is actually spent developing or doing things instead of the constantly interrupted, buggy experience I had before. Because it's Unix-based, everything is familiar or easy to learn since I spend most of my time in a terminal.

Instead of finding a brand new and unfamiliar experience, I found the experience I had wanted Linux to be: a great and consistent environment for me to get things done.

Liked this post? You should follow me on Twitter @randomdrake or subscribe to my feed.
