CommonSpot Custom Field Craziness!

November 1, 2014

We were recently tasked by the University of Wisconsin–Eau Claire with producing a series of advanced custom elements to help their content contributors generate a Pinterest-style image grid.

There are three major components to the element:

  1. An image editor that allows contributors to upload an image and then pan/zoom/crop to a specific image size.
  2. An editor that allows contributors to overlay a caption on top of the cropped image.
  3. Output of multiple cropped & captioned images into a responsive, Pinterest-style display.

Creating the Image Editor Custom Field

Starting from a codebase with limited pan/zoom cropping functionality (http://danielhellier.com/imagecrop/), we refactored the jQuery component to support fixed-size crop areas, implemented a bounding box, and tied in a slider component for easy-to-use zooming. We also added the capability to transmit the scaling/cropping coordinates to an application server for server-side processing.

[Click here to play around with the proof-of-concept]
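For illustration only, here’s a minimal sketch of how the crop widget might transmit its scale and offset values back to the server. The field names, endpoint, and slider wiring below are assumptions, not the production code:

function sendCropData(fieldName) {
  // Hypothetical sketch: collect the current pan/zoom/crop state and send it to
  // the application server, which performs the actual server-side image crop.
  var img = jQuery('#' + fieldName + '-image');

  var cropState = {
    scale: jQuery('#' + fieldName + '-slider').slider('value'), // zoom level from the jQuery UI slider
    offsetX: img.position().left,  // pan offset of the image within the crop area
    offsetY: img.position().top,
    cropWidth: 250,                // fixed-size crop area (example values)
    cropHeight: 250
  };

  jQuery.post('/imagecrop/process.cfm', cropState, function (response) {
    console.log('server-side crop complete', response);
  });
}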

Adding Text Captioning Using a Draggable, Editable <h3> Element

For this project, we needed to give contributors the ability to place a text overlay on the image. The word “and” needed to be automatically converted to uppercase and styled via CSS. To enable users to choose a color, we used the Spectrum plugin for jQuery, which generally worked as advertised.
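As a rough sketch of those two pieces (the helper, class name, and selectors below are ours for illustration, not the element’s actual stripAnd() implementation), the “and” treatment can be handled by wrapping the word in a span that a CSS rule uppercases, and Spectrum is wired up roughly like this:

// Hypothetical helper: wrap the word "and" in a span so a CSS rule such as
// .caption-and { text-transform: uppercase; } can restyle it.
function styleAnd(el) {
  var html = jQuery(el).html();
  jQuery(el).html(html.replace(/\b(and)\b/gi, '<span class="caption-and">$1</span>'));
}

// Hypothetical Spectrum wiring: let the contributor pick the caption color.
jQuery('#caption-color').spectrum({
  color: '#ffffff',
  change: function (color) {
    jQuery('h3.POAeditablelabel').css('color', color.toHexString());
  }
});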

You can make a block element editable simply by adding a contenteditable attribute, as illustrated below. The cropped image is placed behind the <h3> within the container:

<div id="#fqfieldname#-container" style="width: #adjustedWidth#px; height: #adjustedHeight#px">
  <div id="#fqfieldname#-preview" style="border: 1px solid black">

	<h3 class="POAeditablelabel"
            id="edittext-#fqFieldName#" contenteditable
            onFocus="stripAnd(this)">#poaAttribs.text#</h3>

	<img id="#fqfieldname#-image"
	  style="width:#stParams.nWidth#px;
	  height: #stParams.nHeight#px;"
 	 <cfif poaAttribs.src is not ""> src="#poaAttribs.src#"</cfif>
        >

  </div>
</div>

We then applied jQuery UI’s draggable plugin to enable the user to drag the editable heading within its container:

jQuery('##edittext-#fqFieldName#').draggable({ containment: 'parent', axis: 'y' })
  .click(function() {
    // a single click (re-)enables dragging
    jQuery(this).draggable({ disabled: false });
  }).dblclick(function() {
    // a double click disables dragging so the heading can be edited in place
    jQuery(this).draggable({ disabled: true });
  });

Creating a Pinterest-Style Render Handler

Leveraging the jQuery Isotope and Lazy Load plugins, Fig Leaf’s developers devised an algorithm that automatically resizes images into a Pinterest-style image grid, minimizing the right-margin whitespace that typically results from these layouts while accommodating the dynamic text overlays required by the customer.

Every time the browser is resized, we dynamically rewrite the image CSS classes and then reinvoke the Isotope plugin. The key JavaScript function, executed on browser resize, is illustrated here:

function generateStyles() {

  // get pointer to stylesheet
  var ss = $("#POADynamicStyles");
  var container = $("#POAcontainer");
  var width = container.width();

  // set minimum width of images to 160, maximum width to 201
  var minWidth = 160, maxWidth = 201, iWidth = 0;

  // figure out the optimal number of cols, given the available space
  for (var numCols = 2; numCols < 15; numCols++) {
    iWidth = Math.ceil(width / numCols);
    // console.log('cols', numCols, 'width', iWidth);
    if (iWidth <= maxWidth && iWidth >= minWidth) {
      break;
    }
  }

  if (numCols == 15) {
    numCols = 3;
    console.log('auto sizing failed');
  }

  // 10px margin around images
  var w1 = Math.floor(width / numCols) - 10;

  var reducePercentage = w1 / 250;
  var fontSize = reducePercentage * 2.7125;
  var lineHeight = 1;
  var heightOffset = 0;

  heightOffset = (-0.13 * (100 - (reducePercentage * 100)));

  var styles = ''.concat(
    '.item.w2  {width : {1}px;} \n',
    '.item.h2 {height: {1}px;} \n',
    '.item {width : {0}px; height: {0}px;}\n',
    '.item h3 {font-size: {2}rem; line-height: {3}; transform : translate(0px,{4}px)}'
  ).format(w1, w1*2+10, fontSize, lineHeight, heightOffset);

  ss.html(styles);
  initIsotope(w1 + 10);

}
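Note that initIsotope() is defined elsewhere in the element, and .format() is not native JavaScript; the code assumes a String.prototype.format helper along these lines (a sketch, since the original helper isn’t shown in the post):

// Minimal sketch of the assumed String.prototype.format helper:
// replaces {0}, {1}, ... tokens with the corresponding arguments.
if (!String.prototype.format) {
  String.prototype.format = function () {
    var args = arguments;
    return this.replace(/\{(\d+)\}/g, function (match, index) {
      return typeof args[index] !== 'undefined' ? args[index] : match;
    });
  };
}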

[You can view an early proof-of-concept by clicking here].

Would you like to know more?

Contact Fig Leaf Software’s Professional Services group at info@figleaf.com to discover how we can help you achieve your CMS implementation goals.

Categories: ColdFusion, JavaScript

Case Study: NACCHO Model Practices

October 10, 2014

The National Association of County and City Health Officials (NACCHO) engaged Fig Leaf Software to develop a dynamic form/workflow/collaboration application using web standards. We used Sencha’s Ext JS 4 JavaScript framework for the front end and Microsoft .NET/SQL Server on the back end to create a 3-tiered, REST-based architecture that positions them for future growth.

The Challenge

NACCHO’s Model Practices Program honors and recognizes outstanding local health practices from across the nation and shares and promotes these practices among local health departments (LHDs). Model and promising practices cut across all areas of local public health, including, but not limited to, community health, environmental health, emergency preparedness, infrastructure, governmental public health, and chronic disease.

Once practices are designated as model or promising, they are stored in the Model Practices Database so all LHDs can benefit from them. NACCHO began accepting Model Practices submissions in 2003. Since then, NACCHO has placed numerous model and promising practices in the searchable online Model Practice Database with more added each year.

Since 2003, the collection and review of submissions had largely been a manual process. LHDs entered data through a web form that was ultimately downloaded into either a Microsoft Excel spreadsheet or a Microsoft Access database. Submissions went through a two-stage content review and collaboration process, often involving four or more reviewers whose actions were coordinated by NACCHO personnel and tracked manually through a series of Excel spreadsheets. Due to the growth of the program, this labor-intensive process was deemed unsustainable, and Fig Leaf Software was called in to design and implement a workflow system that would automate the submission, review, collaboration, and publication cycle.

Here’s a flow chart that we created as part of our specification process that models the review cycle:

Workflow process model

Figure 1: 3-Tier Architecture

3-Tiered Architecture

A 3-tiered architecture segments your app into three distinct service layers:

  1. The User Interface
    In the pre-smartphone days, organizations could settle on creating a single front-end that only supported desktop browsers. With the proliferation of mobile devices of all shapes, sizes, and capabilities, corporate IT must now consider developing multiple front-ends for their apps. At an absolute minimum, line-of-business apps should support desktop and tablet, with reduced functionality available for phones. This requirement frequently means that multiple front-end apps must be developed in parallel, each accessing a common REST-based web services API implemented at the business intelligence tier. We chose to develop the front-end using Sencha’s frameworks – Ext JS 4 and Sencha Touch – for the following reasons:

    1. The toolkits are based on web standards (JavaScript & HTML5).
    2. Sencha toolkits use a well-defined client-side MVC architecture, helping to enforce coding standards across the developer team and thereby reducing future maintenance costs.
    3. Both frameworks have good tooling (Sencha Architect and Sencha Cmd).
    4. Consistency between the desktop and phone APIs means that we can rapidly develop a mobile phone GUI by repurposing the data model classes from the desktop GUI.
    5. Flexibility to upgrade in the future to the recently released Ext JS 5, which will enable us to support both desktop and tablet GUIs from a single codebase.
    6. Ext JS 4 has full backwards compatibility with IE8 – an important consideration when we evaluated NACCHO’s target audience of municipal health departments.
  2. The Business Intelligence Tier
    This tier marshals data from “back-office” resources – enterprise databases, CRM, mail servers, and more. NACCHO’s I.T. group is in the process of migrating from Adobe ColdFusion to a Microsoft .NET platform as their corporate standard. We honored their preference by creating a rich REST-based web services API that can be invoked from virtually any client-side technology that can parse data in JavaScript Object Notation (JSON) format. This architecture gives NACCHO the flexibility to publish their API so that third parties can easily develop a custom front-end or mashup that leverages the model-practices data. At Fig Leaf, we strongly believe in “open government” and actively seek opportunities to make .gov data resources available to other developers.
  3. The Database Tier
    NACCHO’s corporate standard is Microsoft SQL Server. We designed an efficient, normalized 20-table schema with referential integrity rules that enforce valid data input. Given the dynamic nature of the application, MongoDB might actually have been a better choice from a development perspective – but since it wasn’t a corporate standard and we had already chosen .NET as the middleware, we went with “old reliable.” We use SQL Server full-text indexing to drive the front-end keyword search.

Organizing Around Perspectives

NACCHO Model Practices has five different user roles:

  1. Casual browsers who want to search and retrieve Practices.
  2. Applicants who are submitting Practices for review.
  3. Internal Reviewers on the NACCHO staff who parcel out Practices for review.
  4. External Reviewers who review and comment on Practices and collaborate on forming an opinion as to whether a submission is a Model Practice, a Promising Practice, or neither.
  5. Administrators who perform an initial review of submitted practices, create the submission form, create and run ad-hoc reports, and manage the overall review cycle.

To address the very different roles and responsibilities of the stakeholders, we organized the application around a series of “Perspectives.”

The Browser Perspective

The Browser Perspective, as illustrated by Figure 2, enables users to easily apply filter criteria to search through NACCHO’s Model Practice database. The Ext JS 4 data grid automatically downloads records in the background as the user scrolls, reusing DOM elements on-the-fly to keep memory overhead at manageable levels. Search filters are applied automatically after a user stops typing in a field.

Figure 2: The Browse Perspective running in Satan’s favorite browser (IE 8)

Users can resize and rearrange grid columns. In addition, all of the sections of the layout can be expanded or collapsed in order to maximize available space. These settings persist between a user’s sessions, enabling them to create a personalized interface that only shows the information that they find to be helpful.
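To illustrate both behaviors (this is a hedged sketch with assumed model, store, and endpoint names, not NACCHO’s actual code), an Ext JS 4 grid gets background paging from a buffered store and layout persistence from stateful columns:

// Hypothetical sketch of a buffered, stateful Ext JS 4 grid for browsing practices.
Ext.define('MP.model.Practice', {
    extend: 'Ext.data.Model',
    fields: ['id', 'title', 'category', 'year']
});

var practiceStore = Ext.create('Ext.data.Store', {
    model: 'MP.model.Practice',
    buffered: true,          // fetch pages in the background while the user scrolls
    pageSize: 100,
    leadingBufferZone: 200,
    autoLoad: true,
    proxy: {
        type: 'rest',
        url: '/api/practices',                       // assumed REST endpoint
        reader: { type: 'json', root: 'data', totalProperty: 'total' }
    }
});

Ext.create('Ext.grid.Panel', {
    store: practiceStore,
    loadMask: true,
    stateful: true,          // persist column sizes/order between sessions
    stateId: 'mp-browse-grid', // assumes Ext.state.Manager.setProvider(...) has been configured
    columns: [
        { text: 'Practice', dataIndex: 'title', flex: 1 },
        { text: 'Category', dataIndex: 'category', width: 200 },
        { text: 'Year', dataIndex: 'year', width: 80 }
    ],
    renderTo: Ext.getBody()
});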

The Applicant Perspective

The Applicant Perspective, as illustrated in Figure 3, enables logged-in users to edit, save, and submit a Model Practice for review. Ext JS’ rich form field widgets, customizable validation, and flexible layouts enable us to dynamically assemble the form at runtime based on instructions that are read from the server. We also implemented a “Print” feature that redraws the form in a printer-friendly format with hard page-breaks that separate each section.
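As a hedged sketch of that approach (the endpoint, itemId, and metadata format below are assumptions), the form definition can be fetched as JSON and turned into Ext JS field configs at runtime:

// Hypothetical sketch: assemble the applicant form from a server-supplied definition.
Ext.Ajax.request({
    url: '/api/forms/current',                       // assumed endpoint returning field metadata
    success: function (response) {
        var fields = Ext.decode(response.responseText).fields,
            form = Ext.ComponentQuery.query('#applicantForm')[0];  // assumed form panel itemId

        Ext.Array.each(fields, function (f) {
            form.add({
                xtype: f.xtype,                      // e.g. 'textfield', 'textareafield', 'combobox'
                name: f.name,
                fieldLabel: f.label,
                allowBlank: !f.required              // server-driven validation rule
            });
        });
    }
});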

Figure 3: The Applicant Perspective. Yes, this works in IE 8 too!

To facilitate the editing of large blocks of text, we created an Ext JS extension that integrates the best-in-class TinyMCE 4 WYSIWYG editor, and we added previous/next buttons at the bottom of the screen to help users navigate through the different tabbed sections.
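The previous/next navigation boils down to stepping the tab panel’s active item; a sketch of what such a handler might look like (the ref name is an assumption):

// Hypothetical controller handler: the "Next" button advances the tab panel one section.
onNextSectionClick: function () {
    var tabs = this.getApplicantTabPanel(),              // assumed controller ref
        idx  = tabs.items.indexOf(tabs.getActiveTab());

    if (idx < tabs.items.getCount() - 1) {
        tabs.setActiveTab(idx + 1);                      // "Previous" does the same with idx - 1
    }
}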

The Administrator Perspective

The Administrator Perspective, depicted in Figure 4, uses role-based security to restrict access to administrator and reviewer functionality.

“Super Admins”, of course, have full run of the system but are primarily responsible for the following tasks:

  1. Reviewing the initial submissions and assigning them to an “internal reviewer”, a subject matter expert within NACCHO, who reads through the practice and decides whether it has been properly categorized.
  2. Designing forms
  3. Creating and running reports
  4. Auditing the review cycle
  5. Managing accounts

Figure 4: Assigning an initial reviewer

Using the Form Builder Perspective

Since NACCHO’s survey form changes from year to year, Fig Leaf Software designed the Model Practices application to enable non-technical admins to customize their forms without involving I.T. Form fields can be grouped into tab panels, and we support collecting data via text input fields, a WYSIWYG editor (TinyMCE 4), select boxes, checkboxes, and radio buttons. Admins can set data validation rules to require input on select fields, restrict text input by word count, and more!

Figure 5: Admins can build custom forms without involving I.T.

Running Reports

We’ve implemented several query-by-example reporting tools in the Model Practices application. Admins can quickly identify the status of practices in workflows and run statistical roll-ups on approved documents. Using Sencha Ext JS 4, we were able to easily present data in a scalable grid and display aggregate statistics in a native web chart. We also developed a custom extension that allows users to export the information in any grid to Microsoft Excel, as illustrated in Figure 7.

Figure 7: Executing reports, charting the results, and exporting to Microsoft Excel

Query EVERYTHING!

In rare cases, query-by-example interfaces might not be sufficient to enable administrators to extract the information they require. To handle any reporting criteria that might come up in the future, we implemented the Query Builder, depicted in Figure 8, which enables admins to create a dynamic filter for any field on any form. We’ll be posting the Query Builder code to GitHub before the end of the year.

Figure 8: The Query Builder enables admins to create and save custom reports.

The Internal Reviewer Perspective

As illustrated in Figure 9, the Internal Reviewer’s job is to read through the submitted document and assign subject-matter experts (external reviewers) who will grade and judge the responses. If an Internal Reviewer decides that the submission has been incorrectly classified, they can reclassify it or forward it back to the Administrator who, in turn, can pass it on to a different internal reviewer. This view uses drag-and-drop, searchable grids that facilitate the assignment of external reviewers. We used a third-party extension, Ext.ux.grid.Filterbar, to define the “filter row” depicted in the “Search for Reviewers” grid control.

Figure 9: Internal reviewers read through the submissions and assign them to external reviewers for “grading”

The External Reviewer Perspective

External reviewers are charged with reviewing the application and answering a series of targeted review questions that were defined in the Form Builder perspective. Responses that don’t meet the designated data validation criteria are denoted by a red [X] in the left-side tree control. Once they have completed commenting on applicant responses, they designate the application as being a “Model Practice,” “Promising Practice,” or “Neither.”

Figure 10: Ranking and reviewing applicant responses

The Reconciliation Perspective

If two or more external reviewers disagree as to whether an application is a “Promising Practice” or a “Model Practice”, they are directed into the perspective depicted in Figure 11, where each reviewer can see all of the other reviewers’ responses and comments. They can also schedule a conference call directly within the GUI, conducted via VOIP (implemented by integrating the Twilio API), where they can hash out their differences. All external reviewers must ultimately reach consensus as to whether a submission is a “Model Practice”, “Promising Practice”, or “Neither”.

Figure 11: Comparing other reviewers’ responses, scheduling a conversation, using VOIP

Once the external reviewers reach consensus, the applicant is notified via email of the submission’s final disposition and, if it is rated a “Model Practice” or “Promising Practice”, the application becomes accessible to the public on the website.

Built using Sencha Architect

Sencha Architect, depicted in Figure 12, enabled our development team to respond to changes in our customer’s requirements with agility and to rapidly prototype and visualize new features. Its deep integration with Sencha Cmd made it easy for us to create and post development, testing, and production builds.

Figure 12: Sencha Architect’s visual designer enabled our development team to act with agility.

Futures – Mobile and More!

Model Practice Mobile Prototype

Using Ext JS for the project paid off handsomely when the customer asked us to develop a level-of-effort (LOE) estimate for porting the search perspective to a mobile-phone form factor. Using Sencha Architect and relying on the REST-based API that we produced during the desktop app development phase allowed us to create a quick proof-of-concept using Sencha’s Touch framework.

Would you like to know more?

Please contact us at info@figleaf.com to find out more about our custom application development services and how we can help you realize your visions of productivity enhancements across your enterprise in a cost-effective manner!

Categories: Ext JS 4

Now Hiring: Senior UX Developer / Practice Manager

August 5, 2014

Fig Leaf Software seeks to expand its team by hiring a Senior UX Developer / Evangelist.

The successful candidate will have at least one year of experience developing amazing front-end application GUIs using at least two of the following:

jQuery, jQuery Mobile, Sencha Touch, Sencha Ext JS, Angular JS, Appcelerator Titanium, Cordova 3.5, Native iOS development, Native Android development

Experience with .NET, ColdFusion, PHP, Node.JS and public speaking/technical blogging a plus. 

You will work directly with Fig Leaf’s founder/president, Steve Drucker, to help us expand and scale this growing practice area.

Send resume and sample screenshots to sdrucker@figleaf.com

Categories: Uncategorized

Update your Skillz with New Node.JS Courseware!

August 3, 2014

I’m proud to announce that I just finished authoring our new 1-day Node.JS Fundamentals course – weighing in at 110 pages with 9 hands-on exercises! This class will get you up to speed quickly on creating Node apps using the Express framework, dynamically constructing HTML output with Jade templates, and interacting with popular databases (MySQL, MongoDB, and CouchDB). It also covers socket I/O, file system access, and other exciting topics!

Check out the course outline and register today at:
http://training.figleaf.com/courses/nodejs100.cfm

For information about licensing this course or for a sample to evaluate, please contact me directly at sdrucker@figleaf.com!

Categories: Node.js

Sencha Touch 2.3: Downloading and Saving Large Binary Files to Local Devices with PhoneGap/Cordova

June 24, 2014

Note: This article pertains to PhoneGap/Cordova 3.5

Sencha Touch 2.3 added a new Cordova/PhoneGap abstraction layer to the Ext.Device class. Unfortunately, the documentation and guides don’t seem to include an example of downloading a binary file and saving it to the device’s filesystem.

Since mobile connectivity can be unreliable and usually produces high latency, being able to cache large documents on the device is critical for developing high-performance apps.

Before you can get started, you’ll need to install PhoneGap:

  1. Install Java JRE
  2. Install Node.JS
  3. Install PhoneGap by typing the following at a command prompt:
    npm install -g phonegap

Next, you’ll need to use Sencha Cmd 4.x+ to create a PhoneGap project. Simply change directories to the root of your application’s folder and issue the following command:

sencha phonegap init [AppId]

Where [AppId] is your application’s id (e.g. com.figleaf.myapp)

This will create a /phonegap folder off your project root.

Change directories to the phonegap subdirectory and install common phonegap plugins to enable the Sencha Touch Ext.Device classes to detect device type, enable file system access, and get real-time network information:

  • phonegap plugin add org.apache.cordova.device
  • phonegap plugin add org.apache.cordova.file
  • phonegap plugin add org.apache.cordova.network-information

Now you can get started with adding the download feature!

Request access to the local filesystem by using the code below, which I typically place in the application’s launch function, caching the result in the application object.

Ext.device.FileSystem.requestFileSystem({
    
    type: PERSISTENT,
    size: 50 * 1024 * 1024, // 50mb -- gonna store video
    success: function(fs) {
        MyApp.app.Fs = fs;
    },
    failure: function() {
        Ext.Msg.alert("FileSystem Error","Could not access the local file system<br>You will not be able to save notifications for offline use.");
    }
});

The following controller function illustrates how to download a file with a progress indicator and save it locally on the device. Note that the local file URL is returned to the callback function and would typically be written to a data store.

Also, note that the file had to be written out in chunks; otherwise the FileWriter’s write() method would crash the app somewhere north of a 10MB file size.

saveFile: function(url,recordId,prompt,callback) {

 var me = this;

 // create progress indicator
 var progIndicator = Ext.create('Ext.ProgressIndicator', {
    loadingText: prompt + " {percent}%",
    modal: true
 });

 // create unique filename
 var fileName = url.split('/');
 fileName = fileName[fileName.length - 1];
 fileName = "notification-" + recordId + '-' + fileName;

 // let's get the party started!
 Ext.Ajax.request({

    url: url,
    timeout: 180000,
    useDefaultXhrHeader: false,
    method: 'GET',
    xhr2: true,
    progress: progIndicator,
    responseType: 'blob',
    success: function(response) {
        
        Ext.defer(
           function(p) {
               p.destroy();
           },
           250,
           this,
           [progIndicator]
       );

       // define file system entry
       var fe = new Ext.device.filesystem.FileEntry("/" + fileName, MyApp.app.Fs);

       fe.getEntry({
               
                file: fileName,
                options: {create: true},
                success: function(entry) {

                    console.log('entry',entry);

                    var fullPath = entry.nativeURL;
                    console.log(fullPath);

                     Ext.Viewport.setMasked({xtype:'loadmask'});
                    
                     entry.createWriter(function(fileWriter) {

                         // write data in blocks so as not to
                         // crash iOS

                         var written = 0;
                         var BLOCK_SIZE = 1*1024*1024;
                         var filesize = response.responseBytes.size;
                                           
                         fileWriter.onwrite = function(evt) {
                             if (written < filesize) {
                               fileWriter.doWrite();
                             } else {
                               Ext.Viewport.setMasked(false);
                               if (callback) { 
                                  callback.call(me,fullPath);
                               }  
                             }
                         };
                         
                         fileWriter.doWrite = function() {
                             
                             var sz = Math.min(BLOCK_SIZE, filesize - written);
                             var sub = response.responseBytes.slice(written, written+sz);
                             console.log('writing bytes ' + written + " to " + (written + sz));
                             written += sz;  
                             fileWriter.write(sub);         
                         };
                         
                         fileWriter.doWrite();
                                  

                    });

                },
                
                failure: function(error) {              
                    Ext.Msg.alert("Transaction Failed (02)", error.message);
                }
        });

    },
    failure: function(error) {
       progIndicator.destroy();
       Ext.Msg.alert("Transaction Failed (03)", error.message);
    }

});

Note that while you’ll be able to test this code on device simulators, the Ext.device.filesystem.FileEntry.getEntry() method will fail if the app is run through desktop Chrome.

An alternative approach that we used for Android with Cordova 3.5 involved calling the FileTransfer API’s download() method. To install the file transfer plugin, invoke the following command:

cordova plugin add org.apache.cordova.file-transfer

After installing the plugin, you can access the fileTransfer.download() method as illustrated by the following snippet:

function saveFile(url,recordId,prompt,callback) {

 var me = this;

 Ext.Viewport.setMasked({xtype:'loadmask'});

 var fileName = url.split('/');
 fileName = fileName[fileName.length - 1];
 fileName = "notification-" + recordId + '-' + fileName;


 var newFilePath = MyApp.app.Fs.fs.root.fullPath + fileName;

 MyApp.app.Fs.fs.root.getFile(
    "dummy.html", 
    {create: true, exclusive: false}, 
    function success(fe) {
         var sPath = fe.toURL().replace("dummy.html","");
         var fileTransfer = new FileTransfer();  
         fe.remove();
         fileTransfer.download(
           url,
           sPath + fileName,
           function(theFile) {
             Ext.Viewport.setMasked(false);
             if (callback) {
                callback.call(me,theFile.toURI());
             }
           },
           function(error) {
             Ext.Viewport.setMasked(false);
             Ext.Msg.alert('Failed',JSON.stringify(error));
           }
         );
    }
 );
}

Happy coding!

Categories: Sencha Touch 2

Using Routes with Sencha Touch Navigation Views

June 22, 2014

Note: This post pertains to Sencha Touch 2.3.1

Using routes in Sencha Touch enables you to support the Android “back” button as well as allow for “deep linking” to a deeply nested view.

The only problem with using routes is that there isn’t much documentation or many simple examples describing how to integrate them with a Sencha Touch Navigation View, which is often the primary navigation mechanism in small mobile phone apps.

Supporting routes is a three-step process:

1) Override the Ext.app.History class
It’s not just your imagination – it does appear as though the cards were stacked against you from the beginning. There’s actually a bit of a bug in the Ext.app.History class that prevents the history from dequeuing properly. To fix it, drop in this override:

Ext.define('MyApp.overrides.History', {
    override: 'Ext.app.History',

    back: function() {

        var actions = this.getActions(),
            previousAction = actions[actions.length - 2];

        if (previousAction) {

            actions.pop(); // pop current view

            // Added by Steve Drucker
            // need to pop previous view, because it will get reinstantiated on next line
            actions.pop();

            previousAction.getController().getApplication().redirectTo(previousAction.getUrl());
        }
        else {
            actions[actions.length - 1].getController().getApplication().redirectTo('');
        }
    }
});

2) Override the Default Navigation View Back Button

Ext.define('MyApp.controller.Main', {
    extend: 'Ext.app.Controller',

    requires: [
        'Ext.app.Route'
    ],

    config: {
        routes: {
            '#login': 'onLogin',
            '#forgotpassword': 'onForgotPassword',
            '': 'onHome',
            '#notificationList': 'onNotificationList'
        },

        refs: {
            main: 'main',
            appBackButton: 'main button[ui=back]' // target acquired
        },

        control: {
            "main": {
                show: 'onNavigationviewShow'
            },
            "appBackButton": {
                tap: function(button, e) {
                  var appHistory = this.getApplication().getHistory();

                  // fire previous route
                  appHistory.back();

                  // prevent the default navigation view
                  // back button behavior from firing
                  return false;

                }
            }
        }
    }
});

3) Pop Back to Previous Views in your Route Handlers
Since you disabled the default “pop” action in step 2, you’ll need to deal with this in your route handlers by following the pattern illustrated in the following snippet:

onNotificationSelect: function(id) {
 
 var record = Ext.getStore('Notifications').getById(id);
 
 // if view does not exist, instantiate it
 if (Ext.ComponentQuery.query('notificationdetail').length == 0) {

    this.getMain().push({
        xtype: 'notificationdetail',
        title: record.get('headline'),
        record: record
    });

 } else {

    // we're popping back to the view
    // from a "back" button push
    this.getMain().pop('notificationdetail');
 }
}
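For completeness, here’s a hedged sketch of how a list-item tap might drive that route so navigation always flows through the history stack (the route pattern and handler names below are assumptions based on the snippet above):

// Hypothetical route config entry:
//   routes: { 'notification/:id': 'onNotificationSelect' }

// Hypothetical list tap handler: redirect through the router rather than
// pushing the view directly, so the Android back button and deep links work.
onNotificationTap: function (list, index, target, record) {
    this.getApplication().redirectTo('notification/' + record.getId());
}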

Happy coding!

Categories: Sencha Touch 2

Best Restaurants at Disney

June 20, 2014

My annual pilgrimage to Orlando, FL is quickly approaching, when I will once again pay the national tax on having children (commonly referred to in the USA as “theme park admission”). With that in mind, I thought I might divert from writing about technology for a brief moment and provide some guidance to the uninitiated as to where to find some good eats.

But first, a public safety tip:

Do not eat dinner and then immediately ride the Twilight Zone Tower of Terror.

I’ve been to Dis over 15 times and eaten at most of the establishments. I make a point of always experiencing someplace new rather than hitting the same places over and over, and, generally speaking, I try to keep my expectations low. I work in Washington, DC, which has evolved into quite a foodie destination over the last decade. I’ve also done a ton of expense-account dining over the years and had some extraordinary meals, so I’m rather difficult to impress. The key to Disney dining is to approach each meal with relatively low expectations. If you’re looking for great European food, go to Europe, because you’re not going to find it at Epcot. In my opinion, Disney has always been best when it sticks to the basics rather than trying to bring great, complex ethnic meals to its customers. Buffet-style arrangements tend to produce better and more consistent results than ordering off a menu.

So, without further ado, here’s a list of restaurants where the meals exceeded my expectations:

  • Hoop-De-Do Musical Revue @ Fort Wilderness Campground (American BBQ)
  • Flame Tree BBQ @ Animal Kingdom (BBQ)
  • Whispering Canyon Cafe @ Wilderness Lodge (BBQ)
  • Artist Point @ Wilderness Lodge (Pacific Northwest)
  • Boma @ Animal Kingdom Lodge (African)
  • Citricos @ Grand Floridian (American / Mediterranean)
  • Yak & Yeti @ Animal Kingdom (Asian)
  • Cabana Bar and Beach Club @ Dolphin Hotel

A few notes:

  • I have not yet dined at Victoria and Alberts (Grand Floridian), which is generally considered to be the best restaurant in Orlando.
  • I had a great meal at O’hana many years ago, but recent reviews have not been kind.
  • My business associates tell me that we had a great dinner at Bongo’s, but I think that’s just the Mojitos talking…
  • Had a great time at Chef Mickey’s for breakfast one year, but was disappointed the next. I think that it’s one of those places that you should experience once — but only once.
  • Aborted an attempt to dine at Liberty Tree Tavern after we arrived on-time for our reservation but weren’t seated for over 40 minutes and finally gave up in frustration.

This year we’ll be dropping by a few new places, listed below. Somehow I doubt that the sushi from Splitsville is going to make my list, but who can turn down raw fish served at a bowling alley? What could possibly go wrong? I do have some high hopes for Liberty Tree Tavern, assuming that we don’t have another reservation meltdown….

  • Tony’s Town Square Restaurant (Magic Kingdom)
  • TomorrowLand Terrace Dessert Party
  • Luau Cove
  • T-Rex
  • Rose & Crown Dining Room
  • Biergarten Restaurant
  • Splitsville Luxury Lanes
  • Tusker House (return visit)
  • Narcoossee’s
  • 1900 Park Fare
  • Liberty Tree Tavern

What are your favorites? Feel free to enter a comment in the space below!

Categories: Dining