
Low-level text rendering

14 Sep 2010 · CPOL · 9 min read

Low-level text rendering on iPhone.

Image 1


I needed to position individual text characters in my app, and it was more involved than I thought it would be. So I'm chalking it up for posterity.

But I want to zoom out a bit. What's really important isn't knowing how to do a particular trick. It is your general approach to the generic problem of assimilating and implementing new technology.

First thing I do nowadays when I want to add something new to my app is start with a clean slate and get it working there. It's like working at a clean desk on a Monday morning, or keeping a clean tool shed.

And I've even made my own clean slate -- CreatingXcodeProject.aspx. I ended up brushing up the template a bit -- you can download the new one here: 1prepgroundiphonedev.aspx.

The modifications I made are:

  • Tidying up the view controller's files so that iVars don't share the same name as their getter methods (I explain why this is a bad practice in my introductory article: 1prepgroundiphonedev.aspx).
  • Improving the test picture, and pulling it out as a method. It's a really useful method -- practically every time I am creating a bitmap, positioning a view within a view, or sticking a layer on a layer with transforms involved, I never get it right the first time -- so I need to know how far out I am. With my original test picture, the drawing was sometimes completely off the screen, which was no help. So this one fires out radial rays from (0, 0) to some stupid distance like 10,000. I have used thin wedges rather than lines -- just drawing lines is messier, as you have to specify a line thickness, and since you don't necessarily know how many pixels represent one unit, you can't just say CGContextSetLineWidth(X, 1.0): that might be too small to even be seen, or so big that it fills the screen.

Anyway -- with this scheme, you can just look at what's on your screen and infer the origin. Also, the colour goes from red to blue as we progress counterclockwise from the positive X axis. So we can infer orientation. And the white circle on the grey background gives us bounds. Origin, orientation, bounds. Sorted!
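The red-to-blue colouring can be sketched in plain C as a linear fade over the angle; `RGB` and `ray_color` are made-up names here, not anything from the template:

```c
#include <assert.h>
#include <math.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

typedef struct { double r, g, b; } RGB;

/* Map an angle in [0, 2*pi), measured counterclockwise from the
   positive X axis, to a colour that fades linearly from pure red
   (theta = 0) towards pure blue (theta -> 2*pi). */
static RGB ray_color(double theta)
{
    double t = theta / (2.0 * M_PI);   /* 0.0 on the +X axis, -> 1.0 after a full turn */
    RGB c = { 1.0 - t, 0.0, t };
    return c;
}
```

Glancing at a ray's colour then tells you roughly which direction it points, which is the whole trick.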

I put in a function that transforms a context's coordinate system, giving a Cartesian coordinate system with 0, 0 in the centre, and 1, 1 in the top right corner. Like it should be!
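The maths of that transform can be sketched like so -- a plain-C re-implementation of the affine map, with `Affine`, `Pt` and `cartesianTransform` as hypothetical names (CGAffineTransform uses the same a/b/c/d/tx/ty layout):

```c
#include <assert.h>

/* Same layout as CGAffineTransform: x' = a*x + c*y + tx, y' = b*x + d*y + ty */
typedef struct { double a, b, c, d, tx, ty; } Affine;
typedef struct { double x, y; } Pt;

/* Map Cartesian coords -- (0, 0) at the centre, (1, 1) top-right --
   onto a view's native coords, where (0, 0) is the top-LEFT corner
   and y grows downwards. */
static Affine cartesianTransform(double width, double height)
{
    Affine t = { width / 2.0,  0.0,            /* scale x up to half-width   */
                 0.0,          -height / 2.0,  /* scale y, and flip it       */
                 width / 2.0,  height / 2.0 }; /* shift origin to the centre */
    return t;
}

static Pt apply(Affine t, Pt p)
{
    Pt q = { t.a * p.x + t.c * p.y + t.tx,
             t.b * p.x + t.d * p.y + t.ty };
    return q;
}
```

So on a 320x480 view, (0, 0) lands at (160, 240) and (1, 1) lands at (320, 0): the top-right corner, as promised.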

Image 2

So, I create a new project using this template, and I can just put code in the view's drawRect -- the coordinate system is set up as described above.

Getting text onto the screen is no problem -- I can just set:

[self drawTestShapeToContext: X
      inRect: wholeRect
      withText: YES ];

… and the template will stick some text on the screen. This is how it does it:

if (showText)
{
   CGAffineTransform tmX = CGContextGetTextMatrix(X);
   CGContextSaveGState(X);
   char* text = "Hello World!";
   CGContextSelectFont(X, "Helvetica Bold", 
                       rect.size.width / 13.0, kCGEncodingMacRoman);
   CGContextSetTextDrawingMode(X, kCGTextFill);
   CGContextSetRGBFillColor(X, 0.1f, 0.3f, 0.8f, 1.0f);         
   //CGAffineTransform xform = CGAffineTransformMake(
   //             1.0f,  0.0f,
   //             0.0f, -1.0f,
   //             0.0f,  0.0f   );
   //CGContextSetTextMatrix(X, xform);    
   CGContextShowTextAtPoint(X, centre.x, centre.y, text, strlen(text));
   //  Watch out! TextMatrix doesn't get restored
   //  when you restore the graphics context!
   CGContextRestoreGState(X);
   CGContextSetTextMatrix(X, tmX);
}

(I commented out the transform because I have flipped the coordinate system myself -- if you hadn't done that, the text would appear upside down.)

But this is no use to me. I need finer-grained control. I need to know the bounds of each glyph. What I really want is a CGPath for each glyph. If you don't know about CGPaths, think PDF: glyphs (and everything else, where possible) are represented as a sequence of geometrical moves -- MoveTo(100, 100), then draw an arc centred on (50, 50) with radius 50, and so on -- so that the result is device independent.
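To make that concrete: a device-independent path boils down to a list of elements. Here is a plain-C sketch -- the element kinds mirror the idea behind CGPathElementType, but `Elem`, `Box` and `bounds` are my own invented names -- showing a triangle as a path, plus the glyph-bounds computation I'm after:

```c
#include <assert.h>

typedef enum { MOVE_TO, LINE_TO, CURVE_TO, CLOSE } ElemKind;
typedef struct { double x, y; } Pt;
typedef struct { ElemKind kind; Pt p; } Elem;
typedef struct { double minx, miny, maxx, maxy; } Box;

/* A triangle as a resolution-independent sequence of moves -- the
   same idea a glyph's CGPath uses, just with arcs and Bezier
   curves mixed in. */
static const Elem triangle[] = {
    { MOVE_TO, {   0.0,  0.0 } },
    { LINE_TO, { 100.0,  0.0 } },
    { LINE_TO, {  50.0, 80.0 } },
    { CLOSE,   {   0.0,  0.0 } },
};

/* Bounding box of the on-path points -- a crude version of
   'the bounds of each glyph'. (A real implementation would also
   account for curve control points.) */
static Box bounds(const Elem *e, int n)
{
    Box b = { e[0].p.x, e[0].p.y, e[0].p.x, e[0].p.y };
    for (int i = 1; i < n; i++) {
        if (e[i].kind == CLOSE) continue;
        if (e[i].p.x < b.minx) b.minx = e[i].p.x;
        if (e[i].p.x > b.maxx) b.maxx = e[i].p.x;
        if (e[i].p.y < b.miny) b.miny = e[i].p.y;
        if (e[i].p.y > b.maxy) b.maxy = e[i].p.y;
    }
    return b;
}
```

Because the shape is just numbers, scaling it for any device loses nothing -- that's the device independence.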

I generally start out clicking through Apple's help.

Part one -- the grisly dance with Apple's help system

This next section is about using the Apple help. It's not pretty. If you want to skip straight to the solution, scroll down.

Now, what's the next step? Well, it would be wonderful if I could just Alt + double-click on CGContextShowTextAtPoint and have the help say 'Sorry, Core Graphics doesn't have any function for what you want -- what you want is Core Text'. And then I click on 'Core Text' and get the answer. But the Apple help is not so smart.

This is a horrible experience every time for me. Even finding something again that I was looking at yesterday can be a mission.

Help -> Developer documentation -> Home icon -> iOS 4.0 developer library.

I can see 'Core Text' on the left pane if I scroll down. So I click on that. And I don't even know what core text is, but it sounds like the right thing.

Now I can see 'Core text programming guide' coming up on the right-hand panel, so I click on that. Now I look at the status bar, and it says:

iPhone OS 4.0 library -> Data management -> Strings, text, & fonts -> Core text programming guide -> Introduction

So I double-click on 'Strings, text & fonts' (I don't want to read the Core Text guide without first being aware of all of the other options): 'no documents found'. Great -- my help is broken. So I will have to pick it up from this point online.

Throw 'core text programming guide' into Google and get the same page up.

Looking at its status bar, it is completely inconsistent with what my local help was saying. Here I get:

iPhone Dev centre -> iOS reference library -> Framework -> Media layer -> Core text

Which seems to be missing the level at which the different available technologies are differentiated, which is what I'm looking for.

There is also a warning to say that these pages are not up-to-date with iOS 3.2. Well, at the time of writing, the current version of iOS is 4.0. So even the warning telling us that the information is out of date is itself out of date.

Anyway, I can find it just by clicking the link in the introduction paragraph, which takes me here:

Great! Finally we have got to the juicy stuff. 

I'm not irately trying to bash Apple here. I'm just pointing out that behind your sleek metallic black rectangle of Zen perfection is a lot of people scrabbling hard to collectively get it together. The help misleadingly carries the same gloss veneer, giving you the impression that it has been put together by some mastermind, and you, the common idiot, are thrashing around as a result of your own stupidity.

It's important to get some perspective. You need to start at ground zero, a.k.a. reality. And reality is that the help is a mess. Maybe this is a good thing -- maybe it means they are putting their focus on good code. I would rather good code badly documented than vice versa. Most likely their best brains are not wasted on patching up the help system.

Anyway, let's look at that link. How far can you get before everything turns into 'blah de blah blah BLAH'?

Technical documentation and instruction manuals always sap my will to live. Why does the world conspire thus!

Sure, it all makes perfect sense -- if you understand it already. But to a newcomer, it is a nightmare. Everything is being presented in the wrong order. The problem is, these help files are written by people who have been so exposed to the material that they have forgotten what it is like to learn it -- they have forgotten which questions come first.

I want to dive into some code. This is all too wishy-washy. I want to see some code! As do you probably…

Okay, to cut a long story short, I eventually found some Apple sample code -- through a mix of Googling 'core-text sample' and asking on IRC. Alt + double-clicking on various functions in this sample brought up help pages that link to a couple of other samples.

I like to store links like this at the top of my project:

This way you can open them straight out of Xcode.

OK, that is all the sample code the Internet is providing.

I needed some help on IRC to convert CoreTextArcCocoa from OSX to iOS. Although it does what I need (wrapping text around a circle), the way it is done is not at all transparent. So I just kept gutting it out, moving auxiliary functions into the main body, trimming and pruning until I had the minimum needed to put a glyph on the screen.

CoreAnimationText is very nicely written -- and contains one line that is very interesting:

path = CTFontCreatePathForGlyph(font, glyph, NULL);

Wonderful! Let's stick together!

Part two: How to put a glyph on the screen

Start with a C-style string, something like 'hello world'.

Now, an attributed string is something like 'hello world' with 'hello' in Times New Roman 14-point Italic, and ' world' in Courier underlined. Basically rich text -- remember .RTF? Microsoft WordPad?
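The flattened form of that is worth seeing once. Here is a plain-C sketch -- `Run` and `reassemble` are made-up names, standing in for what Core Text's run machinery does for real -- of 'hello world' split into two attribute runs:

```c
#include <assert.h>
#include <string.h>

/* A 'run' = a maximal stretch of characters sharing one set of
   attributes. */
typedef struct { int start; int length; const char *fontName; } Run;

/* "hello world" with 'hello' in one font and ' world' in another
   flattens to exactly two runs: */
static const Run runs[] = {
    { 0, 5, "Times-Italic" },
    { 5, 6, "Courier" },
};

/* Concatenating the runs' character ranges must give back the
   original string -- attributes ride alongside, they never change
   the text itself. */
static void reassemble(const char *s, const Run *r, int n, char *out)
{
    out[0] = '\0';
    for (int i = 0; i < n; i++)
        strncat(out, s + r[i].start, (size_t)r[i].length);
}
```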

Okay, suppose for now you have your attStr (we will discuss how to create it from a string later). Here is how to render it glyph by glyph:

CTLineRef line = CTLineCreateWithAttributedString(attStr);
CFArrayRef runArray = CTLineGetGlyphRuns(line);

// for each RUN
for (CFIndex runIndex = 0; runIndex < CFArrayGetCount(runArray); runIndex++)
{
   // Get FONT for this run
   CTRunRef run = (CTRunRef)CFArrayGetValueAtIndex(runArray, runIndex);
   CTFontRef runFont = 
     CFDictionaryGetValue(CTRunGetAttributes(run), kCTFontAttributeName);

   // for each GLYPH in run
   for (CFIndex runGlyphIndex = 0; 
        runGlyphIndex < CTRunGetGlyphCount(run); runGlyphIndex++) 
   {
        // get Glyph & Glyph-data
        CFRange thisGlyphRange = CFRangeMake(runGlyphIndex, 1);
        CGGlyph glyph;
        CGPoint position;
        CTRunGetGlyphs(run, thisGlyphRange, &glyph);
        CTRunGetPositions(run, thisGlyphRange, &position);

        // Render it
        {
             CGFontRef cgFont = CTFontCopyGraphicsFont(runFont, NULL);
             CGAffineTransform textMatrix = CTRunGetTextMatrix(run);
             CGContextSetTextMatrix(X, textMatrix);

             CGContextSetFont(X, cgFont);
             CGContextSetFontSize(X, CTFontGetSize(runFont));
             CGContextSetRGBFillColor(X, 1.0, 1.0, 1.0, 0.5);
             CGContextShowGlyphsAtPositions(X, &glyph, &position, 1);
             CGFontRelease(cgFont);
        }
        // Get PATH of outline & stroke outline
        {
             CGPathRef path = CTFontCreatePathForGlyph(runFont, glyph, NULL);
             CGMutablePathRef pT = CGPathCreateMutable();
             CGAffineTransform T = 
               CGAffineTransformMakeTranslation(position.x, position.y);
             CGPathAddPath(pT, &T, path);
             CGContextAddPath(X, pT);
             CGContextSetStrokeColorWithColor(X, [UIColor yellowColor].CGColor);
             CGContextSetLineWidth(X, atLeastOnePixel);
             CGContextStrokePath(X);
             CGPathRelease(pT);
             CGPathRelease(path);
        }
        // draw blue bounding box
        {
             CGRect glyphRect = CTRunGetImageBounds(run, X, thisGlyphRange);
             CGContextSetLineWidth(X, atLeastOnePixel);
             CGContextSetStrokeColorWithColor(X, [UIColor blueColor].CGColor);
             CGContextStrokeRect(X, glyphRect);
        }
   }
}
CFRelease(line);

Say in our previous example, 'hello world' was our attributed string. We convert it into a CTLine. There is a function for drawing out a whole CTLine in one go, but for now, we keep digging.

We find it comprises two CTRuns: 'hello' and ' world'. Obviously, a run is a wodge of text sharing the same attributes.

So for each run, we go through -- glyph by glyph.

For each glyph, we get its CGPath. We could have a ton of fun with that. But let's stay minimal for now and just stroke the outline yellow.

Now that we have dug to the bottom, we can see a host of higher-level functions in a new light.

// Render it
CGFontRef cgFont = CTFontCopyGraphicsFont(runFont, NULL);
CGContextSetFont(X, cgFont);
CGContextSetFontSize(X, CTFontGetSize(runFont));
CGContextSetRGBFillColor(X, 1.0, 1.0, 1.0, 0.5);
CGContextShowGlyphsAtPositions(X, &glyph, &position, 1);

This is just filling the paths, isn't it?

And this obviously just gets the bounding box for the CG path:

CGRect glyphRect = CTRunGetImageBounds(run, X, thisGlyphRange);

The one remaining piece of the puzzle: how to create the attributed string in the first place.

#define FONT @"HelveticaNeue-Bold"
#define FONTSIZE 0.7
#define TEXT @"jkl"
NSString* string = TEXT;

// A string is something like: 'hello world!'
// An attributed string is something like 'hello world!' with 'Hello' in 
// Times New Roman, 14pt and 'world' in Courier Bold.
// An RTF file is basically an attributed string.

CFAttributedStringRef attStr;
// get attributed-string from string
{
   UIFont* font = [UIFont fontWithName:FONT size:FONTSIZE];
   CTFontRef ctFont = CTFontCreateWithName((CFStringRef)font.fontName, 
                                           font.pointSize, NULL);
   // In our example we just set the font,
   // and specify 'only use ligatures when essential'.
   // Ligatures: in some fonts 'fit' comes out
   // as only 2 glyphs, 'fi'+'t', because if 'f' and
   // 'i' are drawn separately, they tend to sit
   // close together and look messy. 'fi' is a ligature.
   NSNumber* NS_0 = [NSNumber numberWithInteger:0];
   NSDictionary *attributes = [NSDictionary dictionaryWithObjectsAndKeys:
       (id)ctFont, kCTFontAttributeName,    // NSFontAttributeName
       (id)NS_0, kCTLigatureAttributeName,  // NSLigatureAttributeName
       nil];
   assert(attributes != nil);
   NSAttributedString* ns_attrString = 
       [[NSAttributedString alloc] initWithString:string
                                       attributes:attributes];
   [ns_attrString autorelease];
   attStr = (CFAttributedStringRef)ns_attrString; 
   CFRelease(ctFont);
}

As I am including all of the developer boilerplate in this article, I will point out that we are not quite done. The next step, once we have something drawing on the screen, is to test it. Try different fonts and sizes. Try drawing upside down. Try drawing from a point other than (0,0). Kick and punch it a little -- toughen it up. In doing this, I discovered that restoring the graphics context doesn't restore the text matrix. Which made me add a couple of crucial lines to my code:

CGAffineTransform textMatrix = CTRunGetTextMatrix(run);
CGContextSetTextMatrix(X, textMatrix);
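Why those lines matter can be shown with a toy model -- this is NOT the real CGContext internals, just a plain-C sketch of the pitfall: everything on the save/restore stack comes back, but the text matrix lives outside the stack, so it doesn't.

```c
#include <assert.h>

/* Toy graphics state: just one saved property for illustration. */
typedef struct { double lineWidth; } GState;

typedef struct {
    GState stack[8];
    int top;
    double textMatrixScale;  /* stands in for the full text matrix */
} Ctx;

static void saveGState(Ctx *c)
{
    c->stack[c->top + 1] = c->stack[c->top];  /* push a copy of the state */
    c->top++;
}

static void restoreGState(Ctx *c)
{
    c->top--;  /* lineWidth reverts... but textMatrixScale is untouched! */
}
```

Save, scribble on both properties, restore: the line width comes back, the text matrix stays scribbled -- which is exactly why the code above stashes and re-sets the text matrix by hand.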

Then --who knows? Depends on what you want. Maybe extract the functionality you need into methods and stick them in a new source file, so you can just drag it into your project.

P.S.: If you're trying this at home -- you need to add the CoreText framework and the following line to your .pch:

#import <CoreText/CoreText.h>

If anyone has useful tips for speeding up the research phase, please share! It took me hours to find the sample projects. I should probably have gone straight to the Apple website -- there is an easy-to-find page listing all of their code examples. Hmm... just checked, and it is junk:

Image 3


This article, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)

Written By
United Kingdom
I like to create and invent things. Recent projects include a novel computer interface (Google for JediPad), a new musical system, and a speech typer.

Currently I am making some innovative musical instruments for iPhone/iPad

PS: Currently stranded in Thailand with no money! Great! So, if you like my articles, please consider putting some coins in the box. If I have to sell my MacBook and teach in a school, that means no more articles! PayPal is sunfish7&gmail!com. Steve Jobs -- if you're reading this, pay me to fix your documentation.
