Drupal Planet
drunken monkey: Updating the Search API to D8 – Part 4: Creating plugin types
The new plugin system is another large and important change in Drupal 8. While you were pretty much on your own if you wanted to provide some kind of plugin system in Drupal 7 (which a lot of modules do – think Views, Rules, etc.) there is now a very sophisticated framework to easily implement plugins and to define your own plugin types in a clean, extensible and very flexible way. And the good news for everyone familiar (code-wise) with the Search API in Drupal 7: the system is very much in line with how Search API implemented plugins itself, so the switch is pretty easy for us.
What is a plugin?
There have already been a lot of great blog posts about the plugin system in Drupal 8, so I won't go into much detail here. Generally speaking, a plugin is a self-contained unit providing a certain, well-defined functionality. A plugin type is defined by a single, central system (which is usually also the only place where the plugins are used), but other modules can then provide their own plugins with an individual implementation of that functionality.
Probably not a very helpful explanation – it's not that easy to explain – but, to use an example, blocks are now plugins in Drupal 8. The Block module provides the "Block" plugin type, which defines how blocks can be displayed on a page and how they can be configured. The Block module also provides all the code utilizing those plugins, letting the user place blocks on the site, ensuring they are displayed when and where appropriate, etc. Other modules then provide implementations of that "Block" plugin type – i.e., individual plugins – that implement different kinds of blocks: the User module provides a block with the login form, the Language module the "Language switcher" block, the Book module a block displaying the current book's navigation, etc.
But there are literally dozens of other types of plugins already included in Drupal core, and contrib is bound to add countless more – four (currently) by the Search API, for example. Entity types, the various Views handlers (fields, filters, sorts, …) and plugins (display, row, …), field types, WYSIWYG editors – all of those are now just different types of plugins, all using one new, large, unified framework to provide their functionality.
So, what do we need to add our own plugin type to that growing list? Well, basically just a plugin manager, which (as explained last time) is a special type of service – that's it. Your new plugin type becomes available by just implementing a single method. Of course, to make your plugin type more useful and easier to use for others, you should add a few more things: an interface describing what plugins of that type should be able to do, an abstract base class for implementing that interface (unless it doesn't make sense in your case) and (optionally) a dedicated annotation for your plugin type.
Let's go through those with the example of Search API processors.
The plugin manager
As mentioned earlier, the plugin manager is simply a special kind of service. There is a base class (DefaultPluginManager) which already takes care of 95% of what's needed; you literally only need to override the constructor. So, we save the following in search_api/src/Processor/ProcessorPluginManager.php:
<?php
class ProcessorPluginManager extends DefaultPluginManager {

  public function __construct(\Traversable $namespaces, CacheBackendInterface $cache_backend, ModuleHandlerInterface $module_handler) {
    parent::__construct('Plugin/search_api/processor', $namespaces, $module_handler, 'Drupal\search_api\Annotation\SearchApiProcessor');
    $this->setCacheBackend($cache_backend, 'search_api_processors');
    $this->alterInfo('search_api_processor_info');
  }

}
?>
(To save space, I skip the namespace, imports and comments for all examples.)
This can be used as a kind of template; the base class will then take care of everything that's necessary to make the plugin system work. Here's what each of the lines does:
- The __construct() method signature contains the service's dependencies, which will be injected when the plugin manager is constructed. $namespaces basically contains all root folders for sub-namespaces – i.e., every module's src/ and/or lib/ directory, plus a few more. The others are the services that handle caching and module interaction (e.g., invoking hooks).
- The parent constructor sets the plugin manager up for this specific plugin type. The two variables are of course just passed on as-is, the other two parameters represent the following:
- Plugin/search_api/processor is the sub-dir (and therefore sub-namespace) in which plugins of this type have to be located for automatic detection. That means, e.g., if the Views module would want to provide a Search API processor, it would have to be in namespace \Drupal\views\Plugin\search_api\processor, otherwise it wouldn't be found.
The general standard for this parameter is either Plugin/[module]/[component] (like here) or Plugin/[ModuleComponent] (so, Plugin/SearchApiProcessor in this case). I think the former is generally preferred if your module defines more than one plugin type, while the latter can be cleaner/more practical if your module only defines a single one – I'm not completely sure myself, though. They probably shouldn't be mixed in a single module, in any case.
(As a side note, I cheated here a bit, we currently use the sub-dir Plugin/SearchApi/Processor for processors. As I'm pretty sure that's wrong, though, we'll change that shortly.)
- \Drupal\search_api\Annotation\SearchApiProcessor is the fully qualified class name of the annotation that will be used for defining plugins of your type. (See here if you have no idea what I'm talking about.) You can just skip this parameter, in which case it will default to \Drupal\Component\Annotation\Plugin – i.e., the generic @Plugin annotation. Core's validation constraints, e.g., use this instead of an individual annotation. Don't ask me what the decision process is here, but I'm pretty sure it's generally encouraged to use a dedicated annotation for your plugin types.
- The second line just sets up caching for the plugin type, to avoid having to re-discover the available plugins each time they are required. search_api_processors here is the cache key that will be used to store the cached entries. You can also pass an array of cache tags as an optional third parameter to setCacheBackend(), which can be useful in special scenarios (i.e., when the cached plugin data should always be cleared alongside other cached data) but usually shouldn't be done.
- The last line just determines that modules can alter the available processor definitions using hook_search_api_processor_info_alter(array &$definitions). Make sure to also document that hook in your module's MODULE.api.php file!
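For illustration, the documentation for that alter hook in search_api.api.php might look roughly like the following sketch (the parameter documentation and the example body are mine, not taken from the module):
<?php
/**
 * Alter the available Search API processor definitions.
 *
 * @param array $definitions
 *   The processor definitions, keyed by plugin ID.
 */
function hook_search_api_processor_info_alter(array &$definitions) {
  // Hypothetical example: swap in a custom class for an existing processor.
  if (isset($definitions['example_processor'])) {
    $definitions['example_processor']['class'] = 'Drupal\my_module\Plugin\search_api\processor\BetterProcessor';
  }
}
?>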
And that's already it for the class; all very straightforward. Now, the only thing that's missing for the service is its entry in the search_api.services.yml file:
services:
  plugin.manager.search_api.processor:
    class: Drupal\search_api\Processor\ProcessorPluginManager
    parent: default_plugin_manager
The service ID here is generally plugin.manager.[module].[component]. Then we simply specify the class we just created and use the handy parent shortcut to copy over the same constructor dependencies that (usually) all plugin managers have. This is no higher magic but just references (and partly copies over) the following definition from core.services.yml:
services:
  default_plugin_manager:
    abstract: true
    arguments: ['@container.namespaces', '@cache.discovery', '@module_handler']
You'll recognize the arguments as exactly those we used in our plugin manager's constructor.
The @SearchApiProcessor annotation
Since we wanted a dedicated annotation instead of the generic @Plugin, we still have to create the SearchApiProcessor annotation class to get the plugin type basically working. This is also very easy: we just put the following into search_api/src/Annotation/SearchApiProcessor.php:
<?php
class SearchApiProcessor extends Plugin {

  public $id;
  public $label;
  public $description;

}
?>
That's all – you just define your plugin annotation's properties and inherit from Plugin, that base class will take care of everything else. The properties aren't even necessary, they are just there to document for other developers what properties the plugin type expects in plugin definitions. (And, though not used in this case, to set defaults for optional properties.) The base class doesn't care about that, it will just save all the properties it encounters in the definition, regardless of whether they are defined or not. (Note, though, that the id property is "magic" – if you want to use that for anything other than the plugin ID (and have the plugin ID in a different property), then you'll have to override getId() in your annotation class.)
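To illustrate that last point, overriding getId() in an annotation class could look something like this sketch (the class and property names here are made up, not part of the Search API):
<?php
class MyExamplePlugin extends Plugin {

  // Hypothetical: the plugin ID lives in $machine_name instead of $id.
  public $machine_name;

  public function getId() {
    // Tell plugin discovery which property holds the plugin ID.
    return $this->definition['machine_name'];
  }

}
?>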
The only thing you do have to take care of is that annotation autoloading will only work if you put the annotation into the \Drupal\[module]\Annotation namespace.
With the above done, you already have a fully functional new plugin type: other modules can define their own Search API processors and we can easily get all defined processors with the following bit of code:
<?php
$processor_plugin_manager = \Drupal::service('plugin.manager.search_api.processor');
$processor_definitions = $processor_plugin_manager->getDefinitions();
foreach ($processor_definitions as $processor_id => $processor_definition) {
  $processors[$processor_id] = $processor_plugin_manager->createInstance($processor_id, $processor_settings[$processor_id]);
}
?>
However, while this is a complete working example as far as Drupal's plugin framework is concerned, it is of course not really practical since we don't specify anywhere what we expect from processor plugins, so modules that want to provide their own processors don't know which methods to provide and how they should behave (without combing the whole module code of the Search API for any calls to processor plugins). So, it is more or less required to also define an interface for plugins of your new type that implementing modules would have to use.
The specific methods in the interface of course differ from case to case, but there are a lot of interfaces (and corresponding base classes or traits) provided in Drupal Core for functionality that is often required for plugins. E.g., PluginInspectionInterface lets you retrieve a plugin's ID and definition, DerivativeInspectionInterface helps with plugin derivatives (look out for an upcoming blog post about those) and PluginBase is a great base class for plugins which implements both of these interfaces and additionally provides a few other handy methods for child classes (first and foremost, a t() method for doing dependency-injected translation). PluginFormInterface provides methods for plugins that should have a configuration form, usually used in conjunction with ConfigurablePluginInterface, which represents plugins with configuration. And ContainerFactoryPluginInterface, as the last one used in the Search API, provides a static create() method for easily implementing proper dependency injection. There are more, though, so take a look through the \Drupal\Component\Plugin and \Drupal\Core\Plugin namespaces before adding custom methods to your plugin interface.
The whole interface then looks like this (conceptually):
<?php
interface ProcessorInterface extends PluginInspectionInterface, DerivativeInspectionInterface, ConfigurablePluginInterface, PluginFormInterface, ContainerFactoryPluginInterface {
  // Put plugin type-specific methods here.
}
?>
This interface is usually put into the same directory (and, therefore, namespace) as the plugin manager (since there is nothing else that really links the plugin manager to the interface), as is a default base class for the plugin type that implements the interface and helps modules avoid boilerplate code when providing their own plugins:
<?php
abstract class ProcessorPluginBase extends PluginBase implements ProcessorInterface {
  // Here, provide default implementations for all methods of the interface (for which it makes sense).
}
?>
And that's all, now you've completely defined a new plugin type for your module. Modules can now provide plugins of that type like this:
<?php
/**
* @SearchApiProcessor(
* id = "example_some_processor",
* label = @Translation("Some processor"),
* description = @Translation("Description of the processor.")
* )
*/
class SomeProcessor extends ProcessorPluginBase {
// Plugin code.
}
?>
Image credit: findicons.com
ThinkShout: Online Fundraising with RedHen Donation
I’ve spent many a late night in the office shouting at my computer screen - all in the name of philanthropy. Technology woes just go with the small nonprofit territory, and the organization I worked for before joining ThinkShout was no exception. Some of those woes, however, I just couldn’t tolerate. Not when they were getting in the way of our mission, and preventing us from raising funds to support our work.
My first foray into online fundraising happened while I was working in the development department of this small nonprofit. We focused primarily on assisting low-income families with safety net services and emergency housing. Over the last few years, we’d seen a huge increase in the percentage of online donations. Times were changing and we’d begun to rely heavily on that online donation page as we saw more and more donors taking the digital route - many of whom were donating to us for the first time. But as our needs evolved, we needed the donation form to evolve with us. We needed more options, more fields, new content as the campaigns changed. Unfortunately, we didn’t realize what a monumental task that would be, given the online donation solution we were using. The problems didn’t end there.
This set-in-stone donation solution probably wasn’t a big concern back when the development team was first evaluating CRMs. Perhaps they didn’t know what they were getting into. Perhaps they didn’t realize they’d be charged almost $200 every time they needed to edit an online donation form. Perhaps they didn’t think we’d use the form that much. The form itself wasn’t very nice to look at. It didn’t match our website’s theme. It got close, but something was always off. There was absolutely nothing convenient about dealing with it.
Those days are a ways behind me, but they’re still on my mind. Since my transition to ThinkShout, I’ve had the opportunity to view these issues from a different perspective and I tend to hear the same complaints from the nonprofits we serve:
- Multi-step checkout
- Clunky check-out systems
- No control over form content
- No support for multiple campaigns
- Forms are expensive to edit
- Can’t fix issues without ticketing your CRM’s support team
- Form theme doesn’t match your website
- Payment methods are limited
Wow. Look at this awesomely long list of annoying things that plague such an integral part of nonprofit fundraising. My blood still boils when I think about how difficult a decision it was to simply submit a ticket to change a single line of text in our donation form - how much effort went into making a case for the cost. This had to change. There had to be something better.
We began working on the RedHen Donation module with the intent to create a better online donation solution. We wanted to create a donation tool that allowed nonprofits to keep everything under one roof. The end result accomplishes this. What we came up with is unique in the sense that it offers nonprofits several features that aren't quite so commonplace. RedHen Donation is inside their website, connected to their CRM. It allows for both one-time and recurring donations, multiple donation pages, and easy-to-edit donation forms.
I chatted with Brandon Lee, the architect behind RedHen Donation, to get a better understanding of RedHen Donation’s capabilities.
Stephanie: So what’s the big deal with RedHen Donation?
Brandon: It’s an integration between RedHen CRM contacts and Drupal Commerce that allows for single-page donations with both one-time and recurring donation options. The recurring donations feature is a pretty big part of it, since that’s not an option you frequently see offered in other donation form tools.
Stephanie: How does it work with RedHen CRM?
Brandon: RedHen Donation allows you to create a single page donation form that can create or update RedHen contacts. It’s tightly-integrated with RedHen, so if you’re already using RedHen CRM, it will be very simple to get up and running.
Stephanie: How does it compare to other donation form modules out there?
Brandon: It’s a standalone module that’s used with the RedHen suite. CRM Core has a similar donation module, but it doesn’t allow for recurring donations. RedHen Donation does. The ability to give donors the option of setting up a recurring donation within the same form is something we’re thrilled to offer nonprofits.
The donation space, in a lot of ways, is what sets it apart from similar modules. If you’re dealing with Commerce specifically, it wants to lock you into the Commerce workflow. With RedHen Donation, you don’t have to go through three different steps to make one donation to an organization. You fill out one page and you’re done. It’s quick and user-friendly. No more filling out a form, then passing through a portal, then onto a confirmation page, then back to your site.
Stephanie: What if I want to create a new form?
Brandon: With the assumption that you already have RedHen set up, it’s relatively simple. You enable the module, then create a donation type, which is a Drupal entity, and during the configuration, you select what kind of product you’re working with - at least with one-time donations. This is a fieldable methodology that we’ve used with MailChimp and other projects in the past. If you understand Drupal concepts of entities, fields, and commerce, then it’s simple. You do need a working knowledge of those three things to build a form. But once it’s complete, it’s incredibly simple for donors to use. That’s the whole point of the single page donation form. You can offer multiple recurring options. Every month for forever, every month for a year, all of those options on the same page.
Stephanie: It sounds like you’ve got a lot of payment options to choose from as an end-user - and the recurring donation option is huge win for fundraisers - but how customizable is this form? And how difficult is it to make changes?
Brandon: It allows for different payment gateways, like iATS - basically, anything supported by Drupal Commerce. Create a contact, indicate which fields you want created, pick a payment gateway, and you’re set. It can be as simple as name, credit card number, and donation amount. It can be as complex as name, credit card, honoree, honoree address, honoree’s favorite color, etc. Payment options allow for text fields, a drop-down menu, or other. Recurring payments open up another myriad of options.
Stephanie: So long as I know Drupal, I can make this form into whatever I need?
Brandon: Exactly.
Stephanie: Say I have multiple campaigns I’m running and I need a different form for each campaign. Can I do that?
Brandon: Absolutely. Let’s say you have multiple different contact types in RedHen because of the campaign. And say I want to send out this particular form because I’m currently soliciting for one particular campaign. You can have as many donation forms as you have entities that are fieldable. You can create a different form on the same content type.
Stephanie: Anything else we should know about it?
Brandon: The availability of RedHen Donation within RedHen CRM is a big deal. You don’t have to worry about third-party integration. You don’t have to send people off to another site to collect the donation. You don’t have to wait on a third-party to send that much-needed information back. With RedHen Donation, it just works. It’s all self-contained. If I’m a nonprofit site admin, this is a great option, and a great reason to consider using RedHen CRM because everything needed to collect donations is all in one place.
Our Conclusion:
RedHen Donation is a fantastic asset for nonprofits looking for a flexible, customizable, and affordable online donation tool. Its single-page functionality and recurring donation options make it a must-have for organizations that need these features to work seamlessly with their Drupal website. For organizations currently shopping around for a CRM, RedHen Donation is a huge plus for RedHen CRM and a compelling reason to select it as your constituent database.
We’re incredibly excited about RedHen Donation and what it offers nonprofits. We do understand that seeing is often believing, so we highly recommend that you install the latest release of RedHen CRM, RedHen Donation, and see what we’re talking about. Not sure how to get started? Do you have more questions than what we covered? Leave us a comment - we’d love to talk RedHen with you.
KnackForge: How to add contextual link on a block
Code Karate: Drupal 7 Entity View Modes Module
The Drupal 7 Entity View Modes Module allows you to define custom view modes for your entities. A view mode allows you to configure which fields on your entity you want to display. For example, Drupal by default has a Teaser view mode that you can set up to display different fields from your Full Content view mode. This module will allow you to add your own view modes on top of the ones Drupal gives you by default.
In this lesson you will learn:
Tags: Drupal, Entities, Drupal 7, Site Building, Drupal Planet
Acquia: Michael Schmid – Drupal 8 means better business
Michael Schmid, CTO of Amazee Labs, and I got the chance to talk in front of my camera during the Drupal Developer Days in Szeged, Hungary. As the technical lead of a successful and growing Drupal shop, I was keen to get his perspectives on how the technology of Drupal helps him do business and how Drupal 8 might help him and his clients even more than ever before.
Chapter Three: Reviewing Code with PHPStorm
One of the main responsibilities of a Drupal core committer is doing a final review of patches before committing them. Since July 1, 2014, I’ve committed over 250 patches to Drupal 8. Thanks to Chapter Three for making that possible.
Reviewing code is time consuming. Here are some things that make my job easier:
Phase2: Introducing OpenPublic’s App Strategy – Re-imagining Content Management For Public Sector
Since it was first developed, OpenPublic has redefined open source public sector content management. Packaging government-focused functionality into a secure Drupal distribution, once a radical notion, is now an established open source web solution. As the foundation of many of Phase2’s public sector site builds,
OpenPublic has demonstrated the importance of solutions that are replicable, that can prevent duplication of services, and provide examples of repeatable best practices. OpenPublic serves as an accelerator for building good government websites, and it contains best practices and features around mobility, security and accessibility.
With the release of OpenPublic candidate 1, we’re simplifying content management in the enterprise public sector space by appifying all functionality and content types in OpenPublic. What was once a wilderness of modules and configuration will be encapsulated in a clean collection of Apps, making all OpenPublic’s out-of-the-box functionality simple to configure by non-technical site administrators. This new App strategy will make it easier and cheaper for governments to implement the web functionality they need.
So, what is an App?
It can be confusing to pin down definitions for terms like modules, features, distributions, and Apps. An App is simply a collection of code (modules, features or themes) that delivers a distinct piece of functionality upon installation. Like the “apps” on your smartphone or tablet, Drupal Apps provide functionality for a specific purpose – quickly, easily, and independently.
In 2011, Jeff Walpole introduced the concept of Apps for Drupal distributions and the new Apps module. Apps improve usability for site administrators, particularly as compared to the traditional Drupal Configuration dashboard. From the beginning, Apps added extensibility and interoperability to Drupal. Now, instead of adding Apps for extensibility, we’re appifying all distribution functionality for OpenPublic 1.0, finally giving the content administrators full configuration control.
Appification of OpenPublic For State and Local Government
One Code Base, Many Departments
OpenPublic Apps provide city, county, and state government agencies the ability to turn on and off independent pieces of functionality without affecting any other functionality on their platform. Many public sector agencies require a unified CMS spanning all departments and agencies. OpenPublic provides this through standard Apps developed for government needs, including directory management, stringent security, and editorial workflow. However, the flexibility of OpenPublic 1.0’s Apps also allows for specific functionality by department. This means that trying out new functionality is as easy as turning an App on or off, giving governments the opportunity to test innovative approaches without heavy risk or cost of implementation. See how San Mateo County uses OpenPublic Apps.
Simplified Configuration
Apps take Drupal development out of the equation, empowering site administrators to skip technical development when configuring individual department sites. Each App is self contained, so changing the configuration does not cause problems with other site features.
Custom Functionality Made Easy
With OpenPublic, users can develop Apps specific to their objectives. San Mateo County, Calif., for instance, used OpenPublic to develop an App which adds custom location mapping to any page on the web platform. Once created, the San Mateo County team was able to test their new App in one department, then enable it for other departments when it was deemed successful. The sky’s the limit with OpenPublic’s new App structure, with unique and flexible functionality for public sector platforms.
OpenPublic is breaking new ground in web experience for public sector site administrators and visitors alike. With the Appification of all functionality in OpenPublic, we are knocking down traditional barriers to Drupal site maintenance and scalability with intuitive configuration. Stop by Experience Director Shawn Mole’s Capital Camp session with Greg Wilson, Director of Public Sector Practice, to learn more about how OpenPublic is truly the next generation of open source government sites.
Mediacurrent: Securing your Data in Drupal
As data breaches are on the rise, protecting sensitive data is more important than ever - for you and your customers. Recently, our partner Townsend Security's Luke Probasco shared some best practices with our team for securing data in Drupal, meeting compliance regulations requiring encryption, and the importance of encryption key management.
Drupal core announcements: Migrate module automated test failures - policy change
The Drupal 8 committers have slightly changed the policy regarding automated test failures as we move toward release of 8.0.0. The traditional policy is that any proposed core patch must pass all automated tests at commit time. We are amending that slightly such that patches which solve Criticals, but fail tests in a non-essential component, are allowed to proceed. For now, the only non-essential component is the Migrate framework. As announced previously, the Migrate framework is not release blocking for 8.0.0 so it should not hold back these Critical patches.
When we do have a patch which is Critical but cannot easily be adjusted to pass Migrate's unit tests, we will
- File a postponed Major followup issue tagged 'Failing Migrate test'. Describe the test that was disabled and the challenge in making it pass. The issue should link to this post so that folks understand why the test was disabled.
- In your patch for the Critical issue, change the name of the failing test method from testFooBar() to failingTestFooBar() and add a @todo to its docblock referencing your follow-up issue (see the sketch after this list).
- If the issue to add "expected fail" functionality to SimpleTest is ever completed (help wanted!), we will use that to skip tests instead of changing the method name.
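As a rough sketch of what that rename looks like in practice (the method name and the @todo wording are placeholders, not taken from an actual core patch):
/**
 * Tests foo and bar behaviour.
 *
 * @todo Rename back to testFooBar() once the postponed
 *   'Failing Migrate test' follow-up issue lands.
 */
public function failingTestFooBar() {
  // Original test body, skipped because the method name no longer starts
  // with "test".
}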
Before release, we will revisit all issues tagged with 'Failing Migrate test' and make a determination about Migrate's suitability for 8.0.0. If deemed unsuitable, Migrate (or a specific migration path) can be added back into 8.1.0 or later.
Drupal core announcements: Drupal 8.x is now 8.0.x - new branch
You may have noticed that, as of a few days ago, there is a new branch in the Git source code repository for Drupal Core: 8.0.x!
This branch introduces "Semantic Versioning" for Drupal 8, which will allow us to have an 8.0.x release line, followed by an 8.1.x release line, and possibly more along the same lines. Unlike the drastic API and architecture changes that would accompany a 7.x -> 8.x or 8.x -> 9.x major release update, and correspondingly long timelines, the changes between 8.0.x and 8.1.x are expected to be more incremental (adding features and minor API refinements). Also, the time between 8.0.x and 8.1.x is expected to be correspondingly shorter. This change in philosophy will take some of the pressure off developers to make sure absolutely everything they want gets into the 8.0.x release, because new features can be added in the 8.1.x line.
Practical implications:
- All of the existing 8.x issues in the Drupal Core issue queue have been moved to the 8.0.x branch.
- Some issues may be pushed off to 8.1.x or 9.x by the branch maintainers: 8.1.x for more incremental changes, and 9.x for more drastic changes; in both cases, for issues that will not make it into 8.0.0.
- Do not file any more 8.x branch issues.
- Switch your local git repositories to use the 8.0.x branch.
- The 8.x branch will be emptied and deleted sometime soon, probably after the next Alpha release, which is currently scheduled for August 6th (see https://groups.drupal.org/node/435848)
- There is also an 8.1.x branch and a 9.x branch. These exist only for purposes of marking issues as "future material". Commits to Drupal 8 Core will be made on the 8.0.x branch only, at least for the time being. We'll make another announcement when it's time to actually start working on the 8.1.x branch.
For background information, you can read all the discussion on this issue: https://www.drupal.org/node/2135189
Phase2: Static Drupal – Taking the pain out of Drupal hosting
All that changed with more dynamic websites like Drupal. We get a lot from using Drupal: a first-class CMS, a huge ecosystem of modules, and the ability to change content live on the server. But we also get a lot of headaches along with it. We have to design a server system that can handle the load, constantly monitor it for failures, apply security updates, stay vigilant about security breaches, and be ready to be woken up when something doesn’t work.
We recently were challenged to find a way to use the power of Drupal as a CMS but serve the site as generated static pages. It was an interesting idea and we decided to see if we could make it work. We started out by building a proof of concept module that would take a Drupal site and render it as plain HTML pages with all the associated assets such as JS, CSS and images. This turned out not to be too hard once we used a little Apache rewrite rule to fix URL paths. We decided to call this module the Static module. It is available on Drupal.org (http://www.drupal.org/project/static).
By itself this module turns out to be really useful for two use cases.
Static websites
There is a class of website that is built in Drupal but is rarely if ever updated. There are no “community” bits to it like comments and webforms. For this type of website, you can use the Static module to periodically export your entire Drupal site to static HTML that can be uploaded to a simple server without PHP, MySQL, memcache or anything other than a web server. If changes need to be made, they can be made on a laptop or staging server and then exported and uploaded again. This allows both viewing the full site for acceptance testing before it goes live and all of the performance and security gains of a static website.
Archive websites
A similar type of website is an archival site such as the various DrupalCon and DrupalCamp websites. While they no longer need to be live Drupal sites or be updated, they should still be accessible to see the content that was there. In the same way as simple static websites, archived websites can be generated, uploaded and then left alone.
Static websites are great and all, but we can actually take it one step further. Turns out this module is even better for building hybrid websites.
Hybrid websites
A hybrid website is the primary reason we created the Static module. We wanted a system where Drupal was installed behind a firewall and a static site was periodically generated and copied over to a public web server. This would allow content writers to create, update and delete content within Drupal, which would track what is changing, and within a short period of time it would be generated and transferred to the public facing site. Essentially it would have the power of content editing in Drupal but be hosted to the public as static HTML. We got this system working and I’m going to share more about it at my talk during CapitalCamp/Drupal4Gov. Be sure to check it out if you are coming.
ThinkShout: Responsive Images in Drupal with the Picture Module
In today’s web device climate, you never know if your site will be viewed on a laptop, a tablet, a phone, an 84-inch 4k monitor, a Blu-ray player, a gaming console, or a refrigerator. Most of us have probably experienced the frustration of using a website that displayed poorly because of its inbuilt assumptions about what the user’s screen would look like. The ability for your web content to adjust to its context---in particular, screen resolution---is critical to making sure you deliver the best web experience possible to every user.
The tools and techniques to do so are known as responsive web design (RWD). One of the first high-profile sites to implement RWD was The Boston Globe, which is a great example to take a look at. RWD in general is beyond the scope of this blog post. Today, we’ll focus on a specific bit of RWD that is a little tricky to handle in Drupal: responsive images.
If you just did what I first did and typed drupal.org/project/responsive_images into your browser to see what popped up, you’ll see a module that is no longer actively maintained. Many responsive image projects have come and gone over the last few years, with varying approaches and degrees of success; it’s a crowded space.
Luckily, Drupal 8 will feature a responsive image handling solution in core with the new Picture module, and it has already been backported to Drupal 7. It’s a bit tricky to set up with configurations spread across several different GUI menus, but once you have it running, it’s a fast, smooth solution to an important challenge, and it plays well with its neighbors.
The Gist Of It
We'll be dealing with a handful of new objects to get responsive image behaviors going smoothly.
- Breakpoints
- Breakpoints are ranges of screen sizes, described by conditional tests (i.e., minimum width = 640)
- Breakpoint Groups
- EWISOTT (Exactly What It Says On The Tin)
- Image Styles
- You may already know these from the Media module; they let you bundle dimensions, scaling modes, etc. into styles that can be reused across your site.
- Picture Mappings
- Picture mappings pair up breakpoints with image styles
Once an image is associated with a responsive style, the Picture module will check the page dimensions, look at the breakpoint group, find the first breakpoint that applies to those dimensions (we'll come back to this point...), look at the picture mapping to find the associated image style, and apply that style to the image. This happens in real time, so a user resizing their window should see the image rescale to fit their new window size instantaneously.
Installation
We'll use Drush, a command-line interface for Drupal, to install the modules and their dependencies, and to enable them.
Picture has two important dependencies:
- The Breakpoints module, which will keep its eye on the browser window size
- The Chaos Tool Suite, which gives us lots of handy development tools and APIs
Drush will handle the dependencies for you; just navigate to your site root and type:
drush en picture -y
We'll also want the Media module:
drush en media -y
(In the above commands, -y just tells drush to assume "yes" for any requests for confirmation.)
Setting Up Breakpoints
Breakpoints can be found under Configuration > Media > Breakpoints. Each breakpoint needs a name and a media query. Optionally, you can enable Retina display handling for each breakpoint.
Note that the smallest breakpoint is set to a 0px minimum. This ensures that arbitrarily small screen sizes will be accommodated.
Ordering
The order in which the breakpoints appear is the order your breakpoint group will check their media queries. The example configuration uses minimums in decreasing order, which is preferable for responsive image design. If a breakpoint query fails (if the screen width is below the minimum), the next breakpoint down the line will be checked. Make sure you get this order right; once you pull these breakpoints into a group, their order cannot be edited; you'd need to delete the breakpoints and their group and start over.
Groups
Click 'Add a new group' to define a Breakpoint Group. The ordering on this screen will match the order defined by weights in the previous step.
Note that once a breakpoint has been added to a group, it cannot be edited.
Responsive Styles
This is an optional step provided by the Breakpoints module; it's essentially a wizard which makes copies of a preexisting image style, one for each selected breakpoint. If you have some image style effects you want to apply everywhere (desaturate, perhaps?), this can be a handy time saver. For general use, it's not really necessary.
Image Styles and Picture Mappings
Set up an image style for each breakpoint under Configuration > Media > Image Styles. For general use, these can be equal to or slightly less than the minimums of the associated breakpoints; for more complex layouts (columns, for instance), these might instead be set to match the behavior of the column widths.
Picture Mappings are found under Configuration > Media > Picture Mappings. First, associate the new Picture Mapping with our Breakpoint Group.
Now that the Picture Mapping has a Breakpoint Group, each breakpoint can be associated with an image style. Populate these with the image styles defined previously, and hit Finish.
File Type Display
Under Configuration > Media > File Types, select Images -> Manage File Display. Enable the Picture display mode, and select the Example Group.
Content Type
Now we're ready to create a node type with a responsive image field.
Make a content type and add a File field with the Media File Selector widget. Make sure that the field permits the image format file extensions you plan to use; by default it only allows *.txt.
Under Manage Display, make sure that the responsive image field is set to the Rendered File display formatter, which will connect the field to the file display mode we set earlier.
The End Product
We're done! Create a node with the example content type, add an image, and start dragging the corner of your window around. The image should resize as the window width passes between breakpoints.
The Benefits
There are several advantages to responsive web design, some of which are particular to image loading.
- Controlled, consistent user experience across devices
- Without clear knowledge of how our sites will be viewed, we cannot effectively design them to meet user needs.
- Ease of navigation
- Never let important elements render offscreen.
- Bandwidth conservation
- Don't send a 4k image to a QVGA-screen phone that doesn't need it.
- Code it once
- No need to build a secondary mobile site.
- SEO optimization
- Every node has one canonical URL, so Google won't split its results between mobile and desktop versions (which could easily drop your site to the dreaded second page of search results)!
- Shareability
- A bad example: Wikipedia. Take a look at how different its mobile version looks. If a mobile user posts an interesting article to Twitter, for example, both desktop and mobile users following the link will be hit with the mobile version, regardless of their device. With a single-URL responsive design, this is a nonissue.
Pixelite: How to create a dashboard with Dashing and integrate with Drupal data
Dashing is a Sinatra (think ruby but not rails) based framework that lets you build dashboards. It was originally made by the guys at Shopify for displaying custom dashboards on TVs around the office.
Why use Dashing
Dashing makes your life easier, freeing you up to focus on more important things - like what data you are looking to display, and what type of widget you want to use.
Features of dashing:
- Opensource (MIT license)
- Widgets are tiny and encapsulated, made with SASS, HTML and coffeescript
- The dashboard itself is simply HTML and SASS, meaning you can theme and style it to suit your needs
- Comes bundled with several powerful widgets
- Widgets are powered by simple data bindings (powered by batman.js)
- Push and pull methods available to each widget
- Pull jobs can be configured to run in the background on a set interval (e.g. every 30 seconds, poll Chartbeat for new data)
- Layout is a drag & drop interface for re-arranging widgets
There are several dashboard modules in Drupal, and yes, you could go to a lot of trouble and re-create the power of Dashing in Drupal, but there is no need.
Dashing is great at what it does, and it only does one thing.
Another advantage is that you can query other sources of data - e.g. Google Analytics or MailChimp and display metrics from those applications on your dashboard.
A really great example (including code) can be found at http://derekweitzel.blogspot.co.nz/2014/03/a-hcc-dashboard-with-osg-accounting.html
Installation of Dashing on Ubuntu 14.04
The only real requirement is Ruby 1.9+ (this comes by default in Ubuntu 14.04; in Ubuntu 12.04 you need to install ruby-1.9 explicitly).
sudo apt-get install ruby ruby-dev nodejs g++ bundler
sudo gem install dashing
You can create a new dashboard with
dashing new awesome_dashboard
cd awesome_dashboard
bundle
You start the application by
sudo dashing start
You now have a dashboard on http://localhost:3030 ready to go.
Creating a new Dashing widget
There are already a few tutorials online, the best of which is probably just the existing suite of widgets available.
Here we will go through a simple example where we want to graph the number of pieces of content in the "Needs review" state (provided by Workbench moderation) in Drupal. This serves as a mini-todo list for content authors, as ideally this number should be as low as possible.
In this example, we are recycling the "List" widget.
Place an instance of the "List" widget on a dashboard - e.g. sample.erb
<li data-row="1" data-col="1" data-sizex="1" data-sizey="1"> <div data-id="newsarticlesreview" data-view="List" data-unordered="true" data-title="News articles in 'Needs review'"></div> <i class="icon-check-sign icon-background"></i> </li>Create a new job to poll for data
Create a new file in jobs/newsarticlesreview.rb, and place:
#!/bin/env ruby
# encoding: utf-8

require 'net/http'
require 'uri'
require 'json'

# TODO replace with a real production host
server = "https://localhost"

SCHEDULER.every '30s', :first_in => 0 do |job|
  url = URI.parse("#{server}/api/content/dashboard?token=FawTP0fJgSagS1aYcM2a5Bx-MaJI8Y975NwYWP12B0E")
  http = Net::HTTP.new(url.host, url.port)
  http.use_ssl = (url.scheme == 'https')
  http.verify_mode = OpenSSL::SSL::VERIFY_NONE
  response = http.request(Net::HTTP::Get.new(url.request_uri))

  # Convert to JSON.
  j = JSON[response.body]

  # Send the counts to the list widget.
  review_content = {}
  review_content['en'] = { label: 'English', value: j['en']['news_article']['needs_review'] }
  review_content['mi'] = { label: 'Māori', value: j['mi']['news_article']['needs_review'] }
  send_event("newsarticlesreview", { items: review_content.values })
end

Create a Drupal data source
Now we need to feed the Dashing request with a Drupal API. I have chosen to do all of these custom, as they are straightforward. In theory you could also craft these with the Services module as well.
Create hook_menu() entry
/**
 * Implements hook_menu().
 */
function CUSTOM_menu() {
  // Dashboard API requests. Protected using a token.
  // e.g. api/content/dashboard?token=FawTP0fJgSagS1aYcM2a5Bx-MaJI8Y975NwYWP12B0E
  $items['api/content/dashboard'] = array(
    'title' => 'Content types broken down by workflow status',
    'page callback' => 'CUSTOM_content_dashboard',
    'access callback' => 'CUSTOM_dashboard_api_access',
    'access arguments' => array('api/content/dashboard'),
    'type' => MENU_CALLBACK,
    'file' => 'CUSTOM.dashboard.inc',
  );
  return $items;
}

Here we define a custom route, and declare the access callback. The access callback is special, as it needs to ensure that access is restricted to only requests with a special token. The token is created from a hash of the Drupal salt combined with the current path and private key, and base64 encoded (much like drupal_get_token() without the session ID check).
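The CUSTOM_token_validation() helper used by the access callback below isn't shown in the post; based on that description, a sketch of it might look like this (the exact hashing recipe is an assumption):

/**
 * Generates the expected token for a dashboard API path.
 *
 * Sketch only: an HMAC of the requested path, keyed with the site's private
 * key and hash salt, URL-safe base64 encoded (much like drupal_get_token()
 * without the session ID).
 *
 * @param string $path
 *   The path being requested, e.g. 'api/content/dashboard'.
 *
 * @return string
 *   The expected token.
 */
function CUSTOM_token_validation($path) {
  return drupal_hmac_base64($path, drupal_get_private_key() . drupal_get_hash_salt());
}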
/**
 * Access callback for the dashboard API endpoints. These are protected by a
 * token.
 *
 * @param string $path
 *   The path that is being requested.
 *
 * @return bool
 *   Whether or not the user has access to the callback.
 */
function CUSTOM_dashboard_api_access($path) {
  global $is_https;

  // HTTPS only.
  $only_allow_https = (bool) variable_get('api_https_only', 1);
  if ($only_allow_https && !$is_https) {
    return FALSE;
  }

  // Only allow GET requests.
  if ($_SERVER['REQUEST_METHOD'] !== 'GET') {
    return FALSE;
  }

  // Check the token is correct.
  $params = drupal_get_query_parameters();
  if (!isset($params['token']) || empty($params['token'])) {
    return FALSE;
  }
  $valid_token = CUSTOM_token_validation($path);
  if ($params['token'] !== $valid_token) {
    return FALSE;
  }

  return TRUE;
}

And finally, the data for the callback.
/**
 * Gathers current content statistics from Drupal, including the amount of
 * content broken down by a) content type, b) workflow state, c) status.
 *
 * @return JSON
 */
function CUSTOM_content_dashboard() {
  $output = array();
  $languages = language_list('enabled');
  $types = node_type_get_types();

  // Workbench states.
  foreach ($languages[1] as $langcode => $language) {
    foreach ($types as $machine_name => $type) {
      // Workbench Moderation in use (remove this if you do not have the module).
      if (workbench_moderation_node_type_moderated($machine_name)) {
        $results = db_query("SELECT COUNT(n.vid) AS total, w.state FROM {node} n JOIN {workbench_moderation_node_history} w ON w.vid = n.vid WHERE n.type = :type AND n.language = :lang AND w.current = 1 GROUP BY w.state", array(':type' => $machine_name, ':lang' => $langcode))->fetchAllAssoc('state');
        foreach ($results as $state => $result) {
          $output[$langcode][$machine_name][$state] = (int) $result->total;
        }
      }
      // No Workbench Moderation for this content type, use the status column.
      else {
        $results = db_query("SELECT COUNT(n.nid) AS total, n.status FROM {node} n WHERE n.type = :type AND n.language = :lang GROUP BY n.status", array(':type' => $machine_name, ':lang' => $langcode))->fetchAllAssoc('status');
        foreach ($results as $status => $result) {
          if ($status == NODE_PUBLISHED) {
            $status = 'published';
          }
          else {
            $status = 'unpublished';
          }
          $output[$langcode][$machine_name][$status] = (int) $result->total;
        }
      }
    }
  }

  drupal_json_output($output);
  drupal_exit();
}

And there you have it. Note that the above code relies on Workbench Moderation being present; if you do not have it, simply remove the relevant section of the code. Note also that the API response is considerably more complex and complete than the example calls for, but this just means you can display more data in more ways on your dashboard.
Here is the finished product:
Extra for experts
Create an init.d script for Dashing; here is a good starter.
Comments
Let me know if you have completed (or started) a recent project to visualise data from Drupal (or related third party applications) and your experiences there. Pictures are always welcome.
Tags: drupal, dashing, dashboards, data visualisation, drupalplanet | Source: Dashing | Category: Tutorial
Deeson Online: Drupal 6 Support after Drupal 8 Official Release - what effect will it have on you?
There's quite a lot of discussion in the Drupal community at the moment about what will happen to support for Drupal 6 sites once Drupal 8 is officially released.
Currently Drupal core only supports one version behind the current stable release. That means that when Drupal 8 has its first stable release (8.0.0) all Drupal 6 sites will no longer have any core support - more specifically security updates.
There was an issue on the D6 queue around this as well as discussions at DrupalCon Austin, which suggested that this support period should be extended to 12 months.
Although there were quite mixed feelings about this, ultimately the security team (along with key module maintainers and other representatives) responded by suggesting that Drupal 6 will be supported for three months after the first release of Drupal 8.
What does this mean for all our current Drupal 6 sites?
Currently there are only three options available for any current Drupal 6 sites:
- Migrate any existing Drupal 6 sites to Drupal 7 (this will still be supported until Drupal 9 is released)
- Wait until Drupal 8 is released and migrate your Drupal 6 site using ‘Migrate in Core: Drupal 6 to Drupal 8'
- Cross your fingers and hope for the best once Drupal 8 is released!
Option 1 is the obvious choice out of these, as Drupal 7 is stable and has been around for several years now and will be supported until Drupal 9 is released.
Option 2 will come with its own problems due to the infancy of Drupal 8 as well as contrib modules that you are currently using that haven’t got a Drupal 8 version yet.
If a contrib module you rely on does not yet have a Drupal 8 version, this could delay your migration and force you to wait beyond the three months for that module to be ported.
Option 3 is at your own risk, and there are still Drupal 5 sites out there but ultimately this will depend upon what your customers are happy to spend their money on.
According to the usage stats, there are still around 3,800 Drupal 5 sites and just under 198,000 Drupal 6 sites, so there are still many sites which will need to be migrated.
Why are customers resistant to migrating?
Our customers are guided (by agencies like us) to keep their sites up-to-date with security releases for core as well as contrib modules etc.
In situations where a core version is due to have support dropped, we would normally recommend that our clients start planning a rebuild of their site. This rebuild would be on the latest stable release (currently Drupal 7) so that they can continue to receive security support.
This also makes it easier (and ultimately cheaper) for them as the community will continue supporting and maintaining contrib modules for the currently supported versions.
Often customers can be resistant to this due to the cost of needing to rebuild their site and migrate their data. This can be understandable, as “if it ain’t broke, don’t fix it”, or they simply just don’t have the available money to spend on building a new site.
At Deeson Group we currently have several sites that are still on Drupal 6, and even one on Drupal 5.
As an agency we will continue to support Drupal 6 once the official release of Drupal 8 comes out, but this will probably become more painful as time goes on: fewer people have experience with this version, and problems arise with contrib modules which are no longer supported.
Our advice?
Get prepared: Make sure that your site is running on Drupal 7 and if not, start thinking about it and planning the migration to Drupal 7 now.
Read more: Drupal 6 Support after Drupal 8 Official Release - what effect will it have on you? | By Mike Davis | 31st July 2014
PreviousNext: Using Vagrant: A Drupal Themer's Perspective
On any project, whether you’re a themer, site builder or developer, you will need a local development environment to work in. Even if everyone on a project works on the same OS, chances are people will have different versions of Apache, MySQL and PHP and different server configurations, not to mention that the local environment will differ from the staging and production environments.
This is where Vagrant can help. Vagrant is an open source setup and configuration tool for virtual machines such as VirtualBox or VMWare (and now with Docker support).
In this post, I give a quick overview of how Vagrant and VMs can improve your developer workflow and consistency.
Matthew Saunders: Anatomy of a Drupalcamp - The Tools
I've been involved with Drupalcamp Colorado since 2007. Sometimes I've had a significant role; other times I've taken a bit more of a back seat. I was also pretty heavily involved in Drupalcon when it was in Denver. Over the last 8 months or so, I've had quite a bit more insight into the Cons themselves through my interaction with the Drupal Association. This last year I've been the project manager for Drupalcamp Colorado 2014. This has left me with some personal insights that might help others wanting to run a camp.
drupal, drupalcamp colorado, how to do a camp
Stanford Web Services Blog: Agile Project Management and its flavors: where does Scrum end and Kanban begin?
Agile has become a big buzz word recently in the project management world. In this post, I'll try to clarify where some of the lines are being blurred between terms like Agile and Scrum, and what some of these terms actually mean.
Keep in mind while reading this that I am primarily trained in Scrum, so my descriptions of other Agile methodologies are only decently informed. :)
What is Agile Project Management?
From the ever-present Wikipedia:
Drupal 8 and iOS: REST Export for Drupal 8 View
Hi! Drupal Community.
As we are all working very hard on the upcoming major release, I am excited to write this. This blog is all about the REST export feature of Drupal 8's Views. Let me first brief you on what the REST API is, and then I will explain how REST export is important for building a custom REST API for your Drupal 8 site.
REST (Representational State Transfer) is a web service architecture for exposing your resources to the outside world. It follows the HTTP architecture to expose the resources of your website, i.e. you have GET, POST, PATCH, DELETE etc. methods to manipulate resources without a web interface. In the context of Drupal, a node will have a GET method that requires the node ID and returns the details of that node in XML, JSON or HAL+JSON format. If you have ever used the REST APIs of popular web services like Twitter or Facebook, you will find those APIs very well defined: for example, GET on /status gives you the current status details and GET on /status/comments gives you the comments posted on your current status. In the context of Drupal we should get node details with GET on /node/{id} and comments on a particular node with GET on /node/{id}/comments. In Drupal 8 we have the first REST endpoint, but we don't have the second. Drupal has opted for a different approach. In Drupal you can use contextual filters on a view to filter content according to parameters passed in the URL. The same concept applies to REST export: we can attach a REST export to a view that only shows comments, with a contextual filter taking the node ID as a parameter. The REST export will then return XML or JSON for the comments related to a particular node.
Here is a screenshot for it.
We can use REST export with taxonomy terms for a particular vocabulary as the parameter. This concept will make the Drupal 8 REST API robust, and it also provides a good example of how the Views and REST modules work together to give developers the flexibility to build a custom REST API for their Drupal site.
Drupal core announcements: Drupal 8 alpha 14 on August 6th
The next alpha for Drupal 8 will be alpha 14! Here is the schedule for the alpha release.
August 3rd-5th: Only critical and major patches committed.
August 6th, 2014: Drupal 8.0.0-alpha14 released. Emergency commits only.
(Note that there are two concurrent Drupal 8 sprints Aug. 7-10, so we aren't planning to have the usual "disruptive patch window" following this alpha.)