My Journey to Certified Technical Architect

Today I received the best email I have seen in a long time: “Congratulations!  You have successfully completed the Review Board Presentation…”

Passing the Certified Technical Architect (CTA) exam is the culmination of more than 3 years of dedication to studying the platform.  I was a late entrant to the program – I didn’t start working with Salesforce until I was already an Architect and Program Manager.  I started my career as an ERP developer (J.D. Edwards) and eventually technical lead at a Big 4 consulting firm.  But I had moved on in my career – no longer wanting to be a “package” guy.  In 2008 I was first exposed to Salesforce – but really nothing more than designing an integration that took TWO WEEKS to build into the back office J2EE app I was working on.  I thought SFDC looked interesting – but didn’t have time to really dive in.  Plus – I aspired to be an Enterprise Architect (not some lowly package guy!!).  I was way too busy building my system that took A YEAR to build… Salesforce seemed like small potatoes…

Enter 2011 – By way of chance I was thrown onto the Nissan implementation (you may have seen Nissan highlighted at Dreamforce in 2012…)  I knew very little about SFDC at this point – but I knew about packages, integration, architecture, development, project management, etc.  Therefore I was thrust in to fill the vacant technical architect role on the Nissan account.  At this point I was still a bit resistant to going all in on SFDC.  (I’M NOT A PACKAGE GUY!!)  But after 14 hard months on the Nissan implementation, I was convinced.  The Nissan project was probably the most difficult project I have ever worked on in my life (it actually almost killed me with pneumonia!) – but as I look back on what we built with a very small Salesforce team (2-3 developers, 2-3 functional/admins) compared to the entire project team (literally hundreds of other resources) it was pretty apparent that the power of the cloud and the promise of Salesforce was not just hype.  Rolling off of the Nissan engagement in 2012 I was not ready to go back to general technology consulting.  I no longer aspired to be the “Enterprise Architect generalist”.  It was now all very clear – I set my eyes on obtaining the Certified Technical Architect badge.  (Yes – I am committed to being a lowly product guy again!!)

Where to start?  Let’s start at the very beginning – a very good place to start!

1) I started with the Admin and Developer certifications.  I had taken the formal DEV401 in a classroom – so I started in on the study materials.  I had heard that the ADM201 and DEV401 classes overlap quite a bit and that if I were to take DEV401 I would be close to covering both certs.  So I studied for both exams.  I primarily used the study guides (Admin Study Guide & Developer Study Guide) posted on the certification website as my studying curriculum.  I fortunately had access to the premier on-line training (AWESOME RESOURCE!) and I found a number of good practice exams online.  I passed Developer and Admin back to back heading into Dreamforce 2012 (where BTW I don’t take full credit but I take A LOT of credit for the successful Nissan implementation that was highlighted at Dreamforce… more on that when I talk about my case study)

2) Once I had the Developer cert I wasn’t sure what direction to go.  I could go straight for architect – but I definitely didn’t feel ready.  There is just way too much ground to cover and getting the admin and dev certs really just helped expose the tip of the iceberg.  I could go straight for advanced dev – but I wasn’t writing code in my day job and it had been years since I was code slinging C or Java.  So I was looking at Advanced Admin or the consulting certifications.  My day job at the time was working as a technical architect over a current Sales Cloud implementation and scoping another Service Cloud implementation.  So I decided to go for the consulting certs.  As an architect I wanted to understand all of the ins and outs of the functional side too.  The best architects are those who know how to properly leverage the out of the box features first and foremost.  So I set my sights on functional expertise (really for the first time in my career).  Once again the material in the study guides was invaluable (Sales Cloud & Service Cloud).  Without the premier training option I would have been at a total loss.  The most important thing to take away from this is: the Certified Technical Architect is expected to understand Sales Cloud, Service Cloud, and platform capabilities inside and out.  So the more project experience you can get the better.  I believe it would not have been possible to pass the CTA without the time I had spent studying the functional features.

3) By now I was rolling.  I carried 4 certifications and really had a broad foundational understanding of the platform.  What next?  I realized that once I started down the road of CTA or DEV501 for good I would be moving away from the core platform features – so I wanted to put down my Advanced Admin before moving on.  I was feeling really confident – so a quick study of the Advanced Admin study guide was all that was necessary to pass this exam.  I honestly felt that Advanced Admin was the easiest test for me – or maybe the one I was most prepared for through my previous studies.

4) So now I carried the 5 core certifications.  A lot of individuals carry all 5 of those.  And for the most part – if you have all 5 of the core certs – you really know the platform.  But this is where I wanted to separate myself from the good to join the elite.  Time for the big guns.  I started studying for both Advanced Developer and CTA.  Realize I am still in the middle of my Advanced Developer certification – which I will cover in a later post.  I used the Premier Training curriculum in the study guide and have so far passed the multiple choice exam.  I still have yet to pass the written assignment.  (I will admit that I didn’t pass my first go round – and all I can say is: DON’T USE JAVASCRIPT!!!)

5) Now on to the CTA.  Part 1 – Self Assessment.  I judged myself a bit too harshly and actually failed the self-assessment the first time.  However I felt ready to at least try for the multiple choice – so I quickly retook the self-assessment and was a little more arrogant about my skills.  I will say that the self-assessment is really heavy on what I would consider development skills, which I did not feel were assessed through the rest of the process.  For example I was never asked to explain what JSON was in the multiple-choice or the Review Board – so as long as you know what it is and how to use it you are probably good.  Whether or not you are truly an EXPERT at JSON?  I’m not even sure what that means to be honest.

6) The CTA multiple choice was hard.  I did not think I passed.  But if you are good at taking multiple choice exams you can usually knock off one or two of the choices and guess at the others – so the passing grade of 63% is probably not that hard to achieve.  Later in the blog and in another post I will list all of my CTA study materials.  My take on the multiple-choice was that it was very heavy on true technical architecture: security, platform capabilities, performance, integration patterns, etc.  It is a very good indicator of whether you are ready to move to the review board (much more so than the self-assessment).
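To see why elimination makes that 63% reachable, here is a rough back-of-the-envelope calculation.  The split between questions you know and questions you guess is purely hypothetical – the point is just how much eliminating two of four choices moves the needle:

```java
public class GuessMath {
    public static void main(String[] args) {
        // Suppose you know 50% of the questions outright...
        double known = 0.50;
        // ...and on the rest you can eliminate two of four choices,
        // leaving a 1-in-2 guess on each remaining question.
        double guessed = (1.0 - known) * 0.5;
        double expectedScore = known + guessed; // 0.50 + 0.25 = 0.75
        System.out.println("Expected score: " + expectedScore);
    }
}
```

Knowing only half the material still projects to roughly 75% – comfortably above the bar – which is exactly why the exam felt harder than my score suggested.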

7) I’m feeling pretty good.  I have passed the CTA multiple choice.  Now it’s time to prepare for the review board.  I had an AMAZING reference in Nissan and a very interesting case study to present.  I felt very confident in my case study.  You really need to have leadership experience on a large and complex project.  The case study should touch on as many of the CTA components as possible… in 30 minutes max.  This is actually much harder than it sounds.  Packing the case study full of content while articulating it clearly and effectively within the 30 minutes takes some very strong public speaking competency.  And honestly this is another huge aspect of the CTA: your communication and presentation skills.  Salesforce wants to ensure you can speak intelligently and effectively to ANY level of a customer organization, from CxO to the off-shore development team.  IMHO – my case study was awesome.  Probably my greatest strength going into the review board, as opposed to the specific technical breadth or depth that other CTAs bring to the table.  Remember – I really only just started with Salesforce in depth in 2011.

8) OK – the hypothetical was HARD.  Not that any of the material was unfamiliar… actually it wasn’t that difficult of a scenario.  What I found was that the volume of the hypothetical requirements was high.  You have 75 minutes to read, process, design, document, and prepare for the hypothetical presentation.  I could have probably spent 75 minutes just on the reading/processing and another 75 minutes on the design and another 75 minutes on the documentation, etc.  So trying to do this on the clock is VERY stressful.  Honestly it’s more like an IQ test in that way.  Because it is not only testing your knowledge, it is testing your ability to think under pressure.  This to me was THE MOST DIFFICULT PART OF ALL OF THE CERTIFICATIONS.

9) I thought my presentation went just OK.  I definitely didn’t like the limited amount of time I had to prepare – so some of the areas of my design were weak.  And boy do the judges like to sniff out weakness.  In my view the judges are there with a predisposition to fail you – not the other way around.  So when they find a loose thread they pull on it to see how far it goes.  And in one particular area (for me: portals) I was weak.  And boy did they sniff it out and prove pretty quickly that I did not know that part of the platform.  So I did not pass.

I received an email two weeks later – basically I did not pass – but they weren’t failing me out yet.  I had passed all but one section (guess which one: security as it relates to portals), and in order to achieve CTA you must not only have a passing grade, you must also pass ALL OF THE SECTIONS.  Honestly that was news to me – but oh well.  Looking back – I did NOT know portals and the specific security & data model implications of partner and HVCP solutions.  So the judges did their job well, and looking back now I would absolutely agree that at the time I did not deserve to pass the review board.

10) Time passes….

11) More time passes….

12) Finally, after almost 6 months, I received an invitation to retake just the security portion of the exam.  This entailed another hypothetical scenario (did I mention this is the most difficult part?).  And guess what – this one was MORE DIFFICULT!  I only had 20 minutes to read, process, design, document, and prepare for the hypothetical presentation.  ONLY 20 MINUTES!!  Wow – in some of the meetings at work it takes 20 minutes just to get started!!

However this time I was much more prepared for the specifics I felt they would be looking for in relation to security.  I had a much better understanding of partner and HVCP solutions.  I understood roles/profiles/permission sets inside and out.  I understood the sharing model inside and out.  Sharing rules, teams, territory mgmt, manual sharing, programmatic sharing, implicit sharing, all sorts of sharing.  I would honestly say that a detailed understanding of the sharing and security model of Salesforce was VERY IMPORTANT for the CTA.  Much more than understanding say – when to use standard vs custom controllers, etc.  If you don’t live and breathe the sharing model and the role hierarchy you need not apply.

13) FINALLY!!!  I received my passing score today.  It had a list of what the panel considered were my strengths and weaknesses.  It also referenced an architect level release exam that I would need to complete to maintain the credential (news to me).  Begin celebration!!  Begin throwing away all of the white papers I have been carrying with me for the last 3 years.  Don’t worry… they are documented below for your (and my) reference.

So now what?  Well – this blog for one.  I refused to spend time declaring myself an expert in Enterprise Architecture without the certification.  But now that I have it – watch out!!  I will be writing a lot.  I have other plans too – but those are my secret.  For now.

Good luck to all of the aspiring CTAs out there!  This was by far the most difficult certification or recognition I have ever studied for.  It was almost a master’s degree in and of itself.  Actually I probably studied for the CTA (including all of my other certs) more than for my entire graduate degree.

Here it is: Greg’s non-official list of CTA study resources (no particular order)


– DEV401 or equivalent
– DEV501 or equivalent
– ADM201 or equivalent
– ADM301 or equivalent
– Sales Cloud Consultant Certification or equivalent
– Service Cloud Consultant Certification or equivalent
– Enterprise Technical Architecture – especially patterns for traversing from the cloud to a customer’s internal network
– Enterprise Business Architecture – especially identifying and managing stakeholders, business processes, and enterprise operating models
– How to talk to Salesforce (the different API options in and out)
– How to run a project (deep understanding and ability to articulate waterfall, iterative, and agile concepts)
– Lead Architect responsibilities including application life-cycle management, automated testing, continuous integration, etc
– Public Speaking
– Mobile Architecture Strategies and Differences
– Understanding of TCP/IP, SSL, x509, etc

White Papers

– Record Level Access: Under the Hood (one of my favorites – study it closely)


– How to Implement Single Sign-On with Salesforce (Delegated Authentication)

Other blogs & resources

– All of the Technical Architect courses on the Salesforce premier training portal


Other tips:

– Understand the security model and how to setup all of the different types of platform capabilities (Reports/Dashboards via Folders, Content via Libraries, Knowledge via Data Categories, Chatter via Groups, etc.)
– In the hypothetical scenario try to calculate basic volumes for  the numbers that they throw at you and any inferred data model that is designed.  Both of my hypotheticals dealt with inferred data volumes as opposed to explicitly defined data volumes
– Understand implicit sharing of the account to other objects as well as the fact that the account hierarchy does NOT implicitly grant any sharing across the account hierarchy
– Understand what happens to the role hierarchy when partner portal accounts are used (1-3 roles are appended underneath the internal account owners role)
– Understand how HVCP works and the sharing model (Sharing Sets, Sharing Groups, etc)
– I strongly recommend setting up a partner and customer community with all sorts of B2B accounts and B2C accounts and playing around with the sharing features to fully vet them out
– Understand the detailed flows for OAuth, IdP init SAML, SP init SAML, and oAuth with SAML
– If you don’t fully grasp OAuth and SAML, setup your own identity provider and build out the solutions.  The light did not go off for me until I built it out myself.
– Review all of the content on
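To make the data-volume tip above concrete, here is the kind of quick arithmetic I mean.  All of the numbers are made up for illustration – the hypothetical will hand you its own counts, usually buried inside the requirements rather than stated outright:

```java
public class VolumeEstimate {
    public static void main(String[] args) {
        // Hypothetical scenario: 2,000 dealers, each with ~500 customers,
        // each customer logging ~24 cases per year, retained for 3 years.
        long dealers = 2000;
        long customersPerDealer = 500;
        long casesPerCustomerPerYear = 24;
        long retentionYears = 3;

        long contacts = dealers * customersPerDealer; // 1,000,000 contacts
        long cases = contacts * casesPerCustomerPerYear * retentionYears; // 72,000,000 cases

        System.out.println("Contacts: " + contacts);
        System.out.println("Cases:    " + cases);
        // 72M cases screams for archiving, skinny tables, or off-platform
        // storage - exactly the large-data-volume discussion the judges want.
    }
}
```

Thirty seconds of multiplication like this turns a throwaway requirement into a defensible talking point on large data volumes.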
Good Luck!

Salesforce Architect – Blog Series Overview

I have a set of blog articles I will be writing over the coming weeks and months (let’s be honest: years).  I wanted to list the outline of what I am planning.  I would love your feedback on which ones you would be interested in seeing sooner than later.  (These are listed in no specific order of priority…)

  • Logical Architecture of Salesforce
  • Overview of Administration/Customization/Coding
  • Capability Maturity Model
  • CoE Governance Framework
  • Master Data Management with Salesforce
  • Environment Management
  • Managing Builds and Releases
  • Building a Salesforce Team
  • Auditing Data
  • Archiving Data
  • Logging
  • Integration Patterns in Salesforce
  • Limits
  • Data Conversion Techniques
  • Pace Layering Releases in Salesforce
  • Org Strategy
  • Compliance Considerations
  • Considerations for TOGAF Practitioners
  • Considerations for ITIL Practitioners
  • Configuration Workbook
  • Third Party Tools
  • Using the AppExchange
  • Building for the AppExchange
  • Content Management Solutions
  • Apex Architecture
  • Visualforce Architecture
  • Person Accounts
  • Translation Workbench
  • Multi-Currency
  • Partner Communities
  • Customer Communities
  • Delegated Administration
  • Workbench
  • ANT Migration Tool
  • Divisions
  • Deferred Sharing Calculations
  • Mutual SSL
  • Encryption
  • SMTP Relay
  • Security Scanner
  • Custom Settings
  • SSO Part 1: SAML SSO with OpenAM on EC2
  • SSO Part 2: Delegated Authentication
  • SSO Part 3: oAuth (with, without, and with both SAML and Delegated Auth)
  • SSO Part 4: Salesforce Identity
  • SSO Part 5: Integrating User Authorization
  • SSO Part 6: Facebook/Twitter Integration
  • SSO Part 7: Best Practices for User Management

So let me know which are interesting to you.  Otherwise they will come out in the order that I am putting together information for clients, building apps, or maybe even writing for my book.

2014 – Lofty Goals

When my free year on EC2 expired I realized that I needed to fork-lift my blog over to a new site.  I am so spoiled on SaaS and PaaS that I couldn’t stand the thought of dealing with my EC2 machine.  So I am officially moving to WordPress tonight and will slowly be migrating any relevant content over here.

I have decided to really focus on building some good content in 2014.   Next year I really want to get some brain food into the architecture space.  I have been very focused on my clients, my certifications, and launching my company (don’t forget about my family) in 2013.  But this year is coming to a close and now it is time to turn my thoughts and goals to next year.

So here they are (in no particular order):

– Attain my Certifications as a Technical Architect and Advanced Developer (completing my collection…)

– Become formally certified as an Enterprise Architect by obtaining TOGAF 9 Certification (Zachman in 2015!)

– Get this blog going for real (hoping to get an article or two onto the Salesforce developer site)

– Get some content out on Pluralsight

– Put together my Book: Architecting Enterprise Solutions on Force.com

– Launch a few of my AppExchange Apps (Event Logger, Business Activity Monitor, Configuration Workbook, to name a few)

And that’s in my spare time… don’t forget all the consulting (which currently pays the bills).  Throw in a full time job as a husband and daddy and double-lab owner and I might be a bit busy.

So let’s get going!

Force.com IDE running on Eclipse 4.2 (Juno)

Quick tip:

When trying to install the Force.com IDE into Eclipse 4.2 (Juno) you will receive errors.  The fix is very simple.  Get a hold of the following JAR file: org.eclipse.update.ui_3.2.300.v20100512.jar and place it in your Eclipse ‘plugins’ directory.  This is especially easy if you have an older version of Eclipse.  I copied it from Indigo’s plugins directory into Juno’s plugins directory and I was immediately in business.

12/11/12 Update – After a LOT of problems with Juno I actually reverted back to Indigo.  Not sure what was going on but it eventually was so bad it wouldn’t load Eclipse at all.

Using the Web Services Connector to access the Salesforce API

In this post we will be creating a client application that can communicate with Salesforce and alter the data in your org.  This would be the building blocks for a custom Java on-premise app.  We are using the WSC open source project for connecting to Salesforce.  This is a web service client stack that makes it easier to use the API.

Here is my configuration:

  • I am using Eclipse (Juno) on Mac OS X (10.8.2).
  • Java 1.6.0_37 (JDK 1.6 is required)


  • Create a project in eclipse.  I named mine: sfdc-wsc-enterprise
  • Log into your org and download the necessary WSDL file (Setup –> Develop –> API –> Enterprise WSDL)
  • Save the WSDL to the top level folder of your eclipse project: enterprise.wsdl
  • Download the WSC JAR.  I downloaded wsc-22.jar and placed it in the top level folder of my project.
  • Issue the following command from the command line (wsdlc is the code-generation tool that ships inside the WSC JAR):
~/dev/sfdc-wsc-enterprise cloudguy$ java -classpath wsc-22.jar com.sforce.ws.tools.wsdlc enterprise.wsdl enterprise.jar
  • This generates the enterprise.jar file and places it in your directory path.
  • Add both JARs to your project (Project –> Properties –> Java Build Path –> Libraries)
  • Add a new package to your project: com.cloudpremise.samples.sfdcWscEnterprise
  • Add a new class, SimpleIntegration, to the new package:
package com.cloudpremise.samples.sfdcWscEnterprise;

import com.sforce.soap.enterprise.Connector;
import com.sforce.soap.enterprise.EnterpriseConnection;
import com.sforce.soap.enterprise.SaveResult;
import com.sforce.soap.enterprise.sobject.Account;
import com.sforce.ws.ConnectionException;
import com.sforce.ws.ConnectorConfig;

public class SimpleIntegration {

  static EnterpriseConnection connection;

  public static void main(String[] args) {
    ConnectorConfig conn = new ConnectorConfig();
    conn.setUsername("<your user id here>");
    conn.setPassword("<your pw + security token here>");
    try {
      connection = Connector.newConnection(conn);

      // create an account
      Account[] newAccounts = new Account[1];
      Account a = new Account();
      a.setName("New Account Name");
      newAccounts[0] = a;

      SaveResult[] saveResults = connection.create(newAccounts);
      if (saveResults[0].isSuccess()) {
        System.out.println("Created Account with ID: " + saveResults[0].getId());
      }
    } catch (ConnectionException e1) {
      e1.printStackTrace();
    }
  }
}
That’s it!  Now you can make calls directly into your org from Java.  You can use the WSC connector to simplify a lot of the work of connecting into your org.  Notice no use of Axis, JAX-WS, etc.  Just a simple download of the WSC and you will be off and running in no time.


(screen shot: the new account created above)

APEX REST + oAuth

We are building an integration between Salesforce and a custom iPhone application.  The system has the following high level requirements:

1) Salesforce will stand up a custom API for communication with the iPhone.  Apex REST web services will be used.

2) oAuth will be used for the iPhone to authenticate into Salesforce.

There were enough nuances in building the application that I thought this subject deserved its own post.  I use a few tools as a test client for the API:

1) Firefox RESTClient (I will illustrate this in the blog entry)

2) cURL (this is a little finicky, especially on Windows)

3) Custom Java application (perhaps a later post can be on the client side.  It is just a test harness though)

Step 1 – Building the custom RESTful API

I am going to make the round trip on this system as simple as possible.  So all this API will do is return a SOQL query of all accounts in the system.  The API is not bulkified or tested, so keep that in mind as you build your actual production application.


@RestResource(urlMapping='/resttest1/*')  // urlMapping assumed – use whatever endpoint name you registered
global with sharing class resttest1 {
    @HttpGet
    global static List<Account> doGet() {
        return Database.query('select AccountNumber from Account');
    }
}

@RestResource – defines the endpoint that we will later use to hit the URL of the custom API.

@HttpGet – defines the function to be called when issuing an HTTP GET command from the client.

return Database.query(…); – Apex REST will automatically handle serializing this list into JSON or XML format.

Upon saving (and successful compilation) the APEX class generates the RESTful endpoint.  That’s it!

Step 2 – Set Up Salesforce to be an oAuth provider

External applications could utilize a session variable or oAuth to authenticate into the API.  I am using oAuth: I can’t determine any other way to authenticate myself using solely RESTful styles.  Session based authentication would require me to use the Enterprise WSDL and a SOAP client to login first.


After creating a remote access record, you are given your oAuth consumer key and oAuth consumer secret.  Those are required in the client application to authenticate.


That’s it – you are ready to connect to the web service.   Pretty easy!

Step 3 – Accessing the RESTful web service using RESTClient

I love using Firefox’s RESTClient Add-on.  It is a perfect debugging tool for issuing RESTful commands and processing the return.


The next step is to authenticate into SFDC.  Using RESTClient, “POST” to the oAuth token endpoint:

https://login.salesforce.com/services/oauth2/token?grant_type=password&client_id=ABCDEF&client_secret=1234567890&username=USERNAME@CLOUDPREMISE.COM&password=SFDCPWTOKEN

  • where ABCDEF is the consumer key from above
  • where 1234567890 is the consumer secret from above
  • where USERNAME@CLOUDPREMISE.COM is the user you want to log in as
  • where SFDCPWTOKEN is the user’s password and security token appended together
Use the following HTTP Headers:
  • Accept: */* (Lack of this header returns the response in XML for some reason)
  • X-PrettyPrint: 1 (optional – will help you to read the response)

After POSTing this, you should receive a response such as:


Take note of the following parameters in the response:

  • “instance_url” : “https://<yourInstance>.salesforce.com” (all future http requests would be made to this URL location)
  • “access_token” : “00DE000……” (all subsequent http requests should include this as the oAuth authentication token)
Now you are ready to call your web service.  Because the web service annotation was @HttpGet, you need to use a GET command:
  • URL = https://<yourInstance>.salesforce.com/services/apexrest/resttest1
  • Headers:
    • Accept: */*
    • X-PrettyPrint: 1 (Optional – If you want to be able to read the response)
    • Authorization: OAuth 00DE000….
Voila – you have built a RESTful integration to Salesforce!
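For readers who want to move from RESTClient to code, here is a minimal Java sketch of the two requests above, assembled as plain strings so the shape of the flow stays explicit.  The class and method names are mine; the placeholder values match the walkthrough:

```java
// A sketch of the oAuth username/password flow used above: one POST body
// for the token endpoint, then an Authorization header on every later call.
public class OAuthRequestSketch {

    // Body for POST https://login.salesforce.com/services/oauth2/token
    static String tokenRequestBody(String consumerKey, String consumerSecret,
                                   String username, String passwordPlusToken) {
        return "grant_type=password"
                + "&client_id=" + consumerKey
                + "&client_secret=" + consumerSecret
                + "&username=" + username
                + "&password=" + passwordPlusToken;
    }

    // Header sent on every subsequent request, e.g. the @HttpGet call
    static String authorizationHeader(String accessToken) {
        return "Authorization: OAuth " + accessToken;
    }

    public static void main(String[] args) {
        System.out.println(tokenRequestBody("ABCDEF", "1234567890",
                "USERNAME@CLOUDPREMISE.COM", "SFDCPWTOKEN"));
        System.out.println(authorizationHeader("00DE000..."));
    }
}
```

From here it is a small step to issue the actual requests with your HTTP client of choice, parse access_token and instance_url out of the JSON response, and GET the apexrest endpoint.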

Using this method you can now:

  1. Write an actual production web service
  2. Write an actual client that can easily access the methods of the web service

It is very simple and very powerful.  Have fun!

Apex REST is relatively new and I have found the documentation to be sub-par.  However, here is what I used to find my way around:

– Official Apex REST site
– Official Apex REST documentation
– CloudSpokes (has some great coverage)
– Official developer forums
– Model Metrics (a good blog posting on oAuth with SFDC)