Feed aggregator

OSTraining: New Video Class: Speeding up Joomla

Drupal News - September 28, 2015 - 5:22pm

If you run a Joomla site, then you really need this week's new video class from Rod Martin called "Speeding up Joomla".

Rod starts by showing that a normal Joomla site is not highly optimized, and then he takes you through 10 steps to improve your site's speed.

First, you'll learn to use Google PageSpeed and YSlow to test your site. Then Rod shows you how to use caching, compression, .htaccess, CDNs and more. By the time you've finished this class, you'll have a blazing fast Joomla site!

Gbyte blog: How to use the Drupal 8 honeypot module efficiently

Drupal News - September 28, 2015 - 2:33pm
The Honeypot module is a great CAPTCHA alternative: it keeps spam bots from submitting content while also saving your site visitors from having to type in mundane character combinations. Configured properly, it will prevent the majority of bots from submitting forms on your site, including registration forms, contact forms, comment forms, content forms... any Drupal forms.

BlackMesh: Cathy Theys is awarded the first Aaron Winborn Award

Drupal News - September 28, 2015 - 9:44am

“Drupal” might as well be Cathy Theys’ middle name. The Drupal enthusiast attended her first DrupalCon in 2011 and the attraction was instant. Since then, she has spent countless hours contributing to Drupal. On Tuesday, September 22, 2015, Cathy’s Drupal dedication was recognized at DrupalCon Barcelona during the Opening Ceremony, as she received the first Aaron Winborn Award. This annual award is presented to individuals who demonstrate personal integrity, kindness, and a commitment to the Drupal community.

Through his contributions and general advocacy, Aaron Winborn introduced countless people to the Drupal community, sharing with them his technical knowledge and insight. Aaron unfortunately lost his battle with ALS this past spring, but to keep his inspiration alive, the Drupal Association created the Aaron Winborn Award to honor and celebrate his life and acknowledge those who continue his passion for and commitment to the Drupal community.

Cathy is well-known as a friend and mentor to all contributors to and advocates of the Drupal community. As a natural leader, she has organized sprints, empowered novices, and promoted all things Drupal. With a strong presence online and in-person at various events, Cathy is a strong believer in spreading knowledge and sharing ideas. Throughout the years, Cathy has exhibited incredible thoughtfulness, integrity, and a dedication to the Drupal Community – qualities and ideals Aaron himself embraced and represented.

Holly Ross, Executive Director of the Drupal Association, presented Cathy with the Aaron Winborn Award, highlighting her efforts to get people involved in Drupal, even Holly herself. Holly acknowledged her personal experience with Cathy, saying, “I can personally attest to it, because she sat next to me in Amsterdam for an hour trying to help me memorize Git commands.”

Cathy was both surprised and honored to win such heartfelt recognition. We are thrilled to have Cathy Theys as part of the BlackMesh team, and will support her as she continues to give back to the Drupal community.

Congratulations Cathy from all of us at BlackMesh!


View the recorded acceptance: https://events.drupal.org/barcelona2015/driesnote


Code Enigma: Drupalcon Session: Looking for the value in Content Strategy

Drupal News - September 28, 2015 - 7:41am
Don't ask yourself: "Can I afford to include content strategy in my web project?"
The real question is "Can I afford NOT to include content strategy in my web project?"

At Drupalcon Europe, our content strategist Koen Platteeuw shared his views on what value content strategy brings to web projects. Here you can watch the recording of the session:



For more information on how Code Enigma can help organisations with their content governance or other content related requirements, check out our Content Strategy services



Annertech: Looking forward to DrupalCon Dublin 2016

Drupal News - September 28, 2015 - 7:22am
Looking forward to DrupalCon Dublin 2016

I’m writing this while sitting on the plane on my way back to Dublin and thinking about events of the last week. As I’m sure you’re all aware by now, DrupalCon will be coming to Dublin next year. We’re completely ecstatic about playing host to DrupalCon and excited about what this might mean for Drupal and the Irish Drupal community.

Processing expensive back-end operations

Drupal News - September 28, 2015 - 6:15am

During the life cycle of a Drupal project there are many situations when you need to perform expensive operations. Examples of these are: populating a newly created field for thousands of entities, calling a web service multiple times, downloading all of the images referenced by a certain text field, etc.

In this article, I will explain how you can organize these operations in order to avoid the pitfalls related to them. I created a GitHub repository with the code of every step in this article.

You will see how we can use update hooks, drush commands or Drupal queues to solve this. Depending on the situation you’ll learn to use one or the other.


The UX team at the Great Shows TV channel has come up with an idea to improve the user experience on their Drupal website. They are partnering with Nuts For TV, an online database with lots of reviews of TV show episodes, fan art, etc. The idea is that whenever an episode is created –or updated– in the Great Shows website, all the information available will be downloaded from a specific URL stored in Drupal fields. Also, they have gone ahead and manually updated all of the existing episode nodes to add the URL to the new field_third_party_uri field.

Your job as the lead back-end developer at Great Shows is to import the episode information from Nuts For TV. After writing the requisite hook_entity_presave implementation that calls _bg_process_perform_expensive_task, you end up with hundreds of old episode nodes that need to be processed. Your first approach may be to write an update hook that loops through the episode content and runs _bg_process_perform_expensive_task.
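That presave hook could look roughly like this (a minimal sketch assuming Drupal 7 APIs; the bg_process module name and the _bg_process_perform_expensive_task helper come from the example repo):

```php
/**
 * Implements hook_entity_presave().
 */
function bg_process_entity_presave($entity, $type) {
  // Only act on entities that carry the third-party URI field.
  if (!isset($entity->field_third_party_uri)) {
    return;
  }
  // entity_extract_ids() returns array($id, $vid, $bundle).
  list($entity_id, , ) = entity_extract_ids($type, $entity);
  // Download and attach the third-party data for this episode.
  _bg_process_perform_expensive_task($type, $entity_id);
}
```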

The example repo focuses on the strategies to deal with massive operations. All the code samples are written for educational purposes, and not for their direct use.

Time-expensive operations are often memory-expensive as well. You want to prevent the update hook from failing because available memory has been exhausted.

Do not run out of memory

With an update hook you can have the code deployed to every environment and run database updates as part of your deploy process. This is the approach taken in the first step in the example repo. You will take the entities that have the field_third_party_uri attached to them and process them with _bg_process_perform_expensive_task.

/**
 * Update from a remote 3rd party web service.
 */
function bg_process_update_7100() {
  // All of the entities that need to be updated contain the field.
  $field_info = field_info_field(FIELD_URI);
  // $field_info['bundles'] contains information about the entities and bundles
  // that have this particular field attached to them.
  $entity_list = array();
  // Populate $entity_list. Something like:
  // $entity_list = array(
  //   array('entity_type' => 'node', 'entity_id' => 123),
  //   array('entity_type' => 'node', 'entity_id' => 45),
  // );
  // Here is where we process all of the items:
  $succeeded = $errored = 0;
  foreach ($entity_list as $entity_item) {
    $success = _bg_process_perform_expensive_task($entity_item['entity_type'], $entity_item['entity_id']);
    $success ? $succeeded++ : $errored++;
  }
  return t('@succeeded entities were processed correctly. @errored entities failed.', array(
    '@succeeded' => $succeeded,
    '@errored' => $errored,
  ));
}

This is when you realize that the update hook never completes due to memory issues. Even if it completes on your local machine, there is no way to guarantee that it will finish in all of the environments where it needs to be deployed. You can solve this by using batch update hooks, so that's what we are going to do in Step 2.

Running updates in batches

There is no exact way of telling when you will need to perform your updates in batches, but if you answer any of these questions with a yes, then you should use batches:

  • Did the single update run out of memory on your local machine?
  • Did you wonder whether the update had died while it ran as a single batch?
  • Are you loading/updating more than 20 entities at a time?

While these provide a good rule of thumb, every situation deserves to be evaluated separately.

When using batches, your episodes update hook will transform into:

/**
 * Update from a remote 3rd party web service.
 *
 * Take all the entities that have FIELD_URI attached to
 * them and perform the expensive operation on them.
 */
function bg_process_update_7100(&$sandbox) {
  // Generate the list of entities to update only once.
  if (empty($sandbox['entity_list'])) {
    // Size of the batch to process.
    $batch_size = 10;
    // All of the entities that need to be updated contain the field.
    $field_info = field_info_field(FIELD_URI);
    // $field_info['bundles'] contains information about the entities and bundles
    // that have this particular field attached to them.
    $entity_list = array();
    foreach ($field_info['bundles'] as $entity_type => $bundles) {
      $query = new \EntityFieldQuery();
      $results = $query
        ->entityCondition('entity_type', $entity_type)
        ->entityCondition('bundle', $bundles, 'IN')
        ->execute();
      if (empty($results[$entity_type])) {
        continue;
      }
      // Add the ids with the entity type to the $entity_list array, that will
      // be processed later.
      $ids = array_keys($results[$entity_type]);
      $entity_list = array_merge($entity_list, array_map(function ($id) use ($entity_type) {
        return array(
          'entity_type' => $entity_type,
          'entity_id' => $id,
        );
      }, $ids));
    }
    $sandbox['total'] = count($entity_list);
    $sandbox['entity_list'] = array_chunk($entity_list, $batch_size);
    $sandbox['succeeded'] = $sandbox['errored'] = $sandbox['processed_chunks'] = 0;
  }
  // At this point we have the $sandbox['entity_list'] array populated:
  // $entity_list = array(
  //   array(
  //     array('entity_type' => 'node', 'entity_id' => 123),
  //     array('entity_type' => 'node', 'entity_id' => 45),
  //   ),
  //   array(
  //     array('entity_type' => 'file', 'entity_id' => 98),
  //     array('entity_type' => 'file', 'entity_id' => 640),
  //     array('entity_type' => 'taxonomy_term', 'entity_id' => 74),
  //   ),
  // );
  // Here is where we process all of the items:
  $current_chunk = $sandbox['entity_list'][$sandbox['processed_chunks']];
  foreach ($current_chunk as $entity_item) {
    $success = _bg_process_perform_expensive_task($entity_item['entity_type'], $entity_item['entity_id']);
    $success ? $sandbox['succeeded']++ : $sandbox['errored']++;
  }
  // Increment the number of processed chunks to see if we finished.
  $sandbox['processed_chunks']++;
  // When we have processed all of the chunks $sandbox['#finished'] will be 1.
  // Then the update runner will consider the job finished.
  $sandbox['#finished'] = $sandbox['processed_chunks'] / count($sandbox['entity_list']);
  return t('@succeeded entities were processed correctly. @errored entities failed.', array(
    '@succeeded' => $sandbox['succeeded'],
    '@errored' => $sandbox['errored'],
  ));
}

Note how the $sandbox array will be shared among all the batch iterations. That is how you can detect that this is the first iteration –by doing empty($sandbox['entity_list'])– and how you signal Batch API that the update is done. The $sandbox is also used to keep track of the chunks that have been processed already.

By running your episode updates in batches, your next release will be safer, since you will have decreased the chances of memory issues. At this point, you observe that the next release will take two extra hours because these operations run as part of the deploy process. You decide to write a drush command that takes care of updating all your episodes, decoupling the data import from the deploy process.

Writing a custom drush command

With a custom drush command you can run your operations in every environment, and you can do it at any time and as many times as you need. You have decided to create this drush command so Matt (the release manager at Great Shows) can run it as part of the production release. That way he can create a release plan that is not blocked by a two-hour update hook.

Drush runs in your terminal, which means that it runs under PHP CLI. This allows you to have different configurations for running your drush commands without affecting your web server. Thus, you can set a very high memory limit for PHP CLI to run your expensive operations. Check out Karen Stevenson’s article to test your custom drush commands with different drush versions.

To create a drush command from our original update hook in Step 1 we just need to create the drush file and implement the following methods:

  • hook_drush_command declares the command name and options passed to it.
  • drush_{MODULE}_{COMMANDNAME}. This is the main callback function, the action will happen here.

This results in:

/**
 * Main command callback.
 *
 * @param string $field_name
 *   The name of the field in the entities to process.
 */
function drush_bg_process_background_process($field_name = NULL) {
  if (!$field_name) {
    return;
  }
  // All of the entities that need to be updated contain the field.
  $field_info = field_info_field($field_name);
  $entity_list = array();
  foreach ($field_info['bundles'] as $entity_type => $bundles) {
    // Some of the code has been omitted for brevity’s sake. See the example
    // repo for the complete code.
  }
  // At this point we have the $entity_list array populated. Something like:
  // $entity_list = array(
  //   array('entity_type' => 'node', 'entity_id' => 123),
  //   array('entity_type' => 'file', 'entity_id' => 98),
  // );
  // Here is where we process all of the items:
  $succeeded = $errored = 0;
  foreach ($entity_list as $entity_item) {
    $success = _bg_process_perform_expensive_task($entity_item['entity_type'], $entity_item['entity_id']);
    $success ? $succeeded++ : $errored++;
  }
}

Some of the code above has been omitted for brevity’s sake. Please look at the complete example.

After declaring the drush command there is almost no difference between the update hook in Step 1 and this drush command.
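The declaration itself is short. A sketch of what it could look like, assuming the background-process command name used in the article, placed in a bg_process.drush.inc file:

```php
/**
 * Implements hook_drush_command().
 */
function bg_process_drush_command() {
  $items['background-process'] = array(
    'description' => 'Perform the expensive operation on all entities that have a given field.',
    'arguments' => array(
      'field_name' => 'The name of the field attached to the entities to process.',
    ),
  );
  return $items;
}
```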

With this code in place, you will have to run drush background-process field_third_party_uri in an environment to be able to QA the updated episodes. Drush also introduces some additional flexibility.

As the dev lead for Great Shows, you know that even though you can configure PHP CLI separately, you still want to run your drush command in batches. That will save some resources and will not rely on the PHP memory_limit setting.

A batch drush command

The transition to a batch drush command is also straightforward. The callback for the command will be responsible for preparing the batches. A new function will be written to deal with every batch, which will be very similar to our old command callback.

Looking at the source code for the batch command you can see how drush_bg_process_background_process is responsible for getting the array of pairs containing entity types and entity IDs for all of the entities that need to be updated. That array is then chunked, so every batch will only process one of the chunks.

The last step is creating the operations array. Every item in the array will describe what needs to be done for every batch. With the operations array populated we can set some extra properties to the batch, like a callback that runs after all batches, and a progress message.

The drush command to add the extra data to the episodes uses two helper functions in order to have more readable code. _drush_bg_callback_get_entity_list is a helper function that will find all of the episodes that need to be updated, and return the entity type and entity ID pairs. _drush_bg_callback_process_entity_batch will update the episodes in the batch.
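The glue between those helpers and the batch engine could look roughly like this (a sketch following Drupal 7's Batch API and drush's batch runner; the helper names are the ones mentioned above, and the chunk size of 10 is an assumption):

```php
/**
 * Main command callback: prepare and launch the batch.
 */
function drush_bg_process_background_process($field_name = NULL) {
  // Entity type / entity ID pairs for everything that needs updating.
  $entity_list = _drush_bg_callback_get_entity_list($field_name);
  // One operation per chunk; each operation invokes the batch worker.
  $operations = array();
  foreach (array_chunk($entity_list, 10) as $chunk) {
    $operations[] = array('_drush_bg_callback_process_entity_batch', array($chunk));
  }
  $batch = array(
    'operations' => $operations,
    'finished' => '_drush_bg_callback_batch_finished',
    'progress_message' => t('Processed @current out of @total chunks.'),
  );
  batch_set($batch);
  // In drush, the batch has to be kicked off explicitly.
  drush_backend_batch_process();
}
```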

It is common to need to run a callback on a list of entities in a batch drush command. Entity Process Callback is a generic drush command that lets you select the entities to be updated and apply a specified callback function to them. With that, you only need to write a function that takes an entity type and an entity object, and pass the name of that function to it: drush epc node _my_custom_callback_function. For our example, all the drush code is simplified to:

/**
 * Helper function that performs an expensive operation for EPC.
 */
function _my_custom_callback_function($entity_type, $entity) {
  list($entity_id, , ) = entity_extract_ids($entity_type, $entity);
  _bg_process_perform_expensive_task($entity_type, $entity_id);
}

Running drush batch commands is a very powerful and flexible way of executing your expensive back-end operations. However, it will run all of the operations sequentially in a single run. If that becomes a problem you can leverage Drupal’s built-in queue system.

Drupal Queues

Sometimes you don’t care whether your operations are executed immediately; you only need them to be executed at some point in the near future. In those cases, you may use Drupal queues.

Instead of updating the episodes immediately, there will be an operation per episode waiting in the queue to be executed. Each one of those operations will update an episode. All of the episodes will be updated only when all of the queue items –the episode update operations– have been processed.

You will only need an update hook to insert a queue item to the queue with all the information for the episode to be updated later. First, create the new queue that will hold the episode information. Then, insert the entity type and entity ID in the queue.
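Creating and filling the queue can be as small as this (a sketch; the 'bg_process' queue name is an assumption, and $entity_list is built the same way as in the earlier steps):

```php
/**
 * Enqueue every entity that needs the expensive operation.
 */
function bg_process_update_7100() {
  $queue = DrupalQueue::get('bg_process');
  // createQueue() is idempotent; it is safe to call on an existing queue.
  $queue->createQueue();
  // $entity_list holds entity type / entity ID pairs, as in previous steps.
  foreach ($entity_list as $entity_item) {
    // Each queue item carries everything the worker will need later.
    $queue->createItem($entity_item);
  }
}
```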

At this point you have created a queue and inserted a bunch of entity type and ID pairs, but there is nothing that is processing those items. To fix that you need to implement hook_cron_queue_info so queue elements get processed during cron runs. The 'worker callback' key holds the function that is executed for every queue item. Since we have been inserting an array for the queue item, that is what _bg_process_execute_queue_item –your worker callback– will receive as an argument. All that your worker needs to do is to execute the expensive operation.
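Wiring the queue to cron could look like this (a sketch assuming the same 'bg_process' queue name; the 60-second time limit is an illustrative value):

```php
/**
 * Implements hook_cron_queue_info().
 */
function bg_process_cron_queue_info() {
  $queues['bg_process'] = array(
    // Called once per queue item during cron runs.
    'worker callback' => '_bg_process_execute_queue_item',
    // Maximum time (in seconds) cron may spend on this queue per run.
    'time' => 60,
  );
  return $queues;
}

/**
 * Worker callback: receives the array we inserted as the queue item.
 */
function _bg_process_execute_queue_item($item) {
  _bg_process_perform_expensive_task($item['entity_type'], $item['entity_id']);
}
```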

There are several ways to process your queue items.

  • Drupal core ships with the aforementioned cron processing. This is the basic method, and the one used by Great Shows for their episode updates.
  • Similar to that, drush comes with drush queue-list and drush queue-run {queue name} to trigger your cron queues manually.
  • Fellow Lullabot Dave Reid wrote Concurrent Queue to process your queue operations in parallel and decrease the execution time.
  • The Advanced Queue module will give you extra niceties for your queues.
  • Another alternative is Queue Runner. This daemon will be monitoring your queue to process the items as soon as possible.

There are probably even more ways to deal with the queue items that are not listed here.


In this article, we started with a very naive update hook to execute all of our expensive operations. Resource limitations made us turn that into a batch update hook. If you need to detach these operations from the release process, you can turn your update hooks into a drush command or a batch drush command. A good alternative to that is to use Drupal’s queue system to prepare your operations and execute them asynchronously in the (near) future.

Some tasks will be better suited for one approach than others. Do you use other strategies when dealing with this? Share them in the comments!

Red Crackle: Adding fields and metadata to product

Drupal News - September 28, 2015 - 5:47am
This is tutorial #3 in the Drupal Commerce tutorial series. In this post, you will learn how to add fields and other metadata to the Drupal Commerce product. In this specific example, we'll add description and image fields. This information will be exposed on the product display page so that it appears next to the Add to Cart button. We will also enable user reviews and ratings.

Drupalize.Me: Meet Project Manager Alice Jensen

Drupal News - September 28, 2015 - 5:14am

We interview Alice Jensen about what it means to be a project manager and share advice from her experience. Copenhagen-based Project Manager (PM) Alice Jensen has been Drupaling since 2012. Her coworkers describe her with affection, using words such as "fearless", "calm", and "passionate". Read more about Alice's approach to her job as a project manager in this Drupalize.Me interview, part of our Drupal roles series.

Dries Buytaert: Acquia raises $55 million series G

Drupal News - September 28, 2015 - 4:59am

Today, we're excited to announce that Acquia has closed a $55 million financing round, bringing total investment in the company to $188.6 million. Led by new investor Centerview Capital Technology, the round includes existing investors New Enterprise Associates (NEA) and Split Rock Partners.

We are in the middle of a big technological and economic shift, driven by the web, in how large organizations and industries operate. At Acquia, we have set out to build the best platform for helping organizations run their businesses online, help them invent new ways of doing business, and maximize their digital impact on the world. What Acquia does is not at all easy -- or cheap -- but we've made good strides towards that vision. We have become the backbone for many of the world's most influential digital experiences and continue to grow fast. In the process, we are charting new territory with a unique business model rooted in Drupal and Open Source.

A fundraising round like this helps us scale our global operations, sales, and marketing, as well as the development of our solutions for building, delivering and optimizing digital experiences. It also gives us flexibility. I'm proud of what we have accomplished so far, and I'm excited about the big opportunity ahead of us.

ERPAL: How Drupalcon helped us to improve

Drupal News - September 28, 2015 - 12:14am

This DrupalCon in Barcelona was really special for us because we had so many touch points with the community and other businesses in the Drupal ecosystem. This DrupalCon helped us increase not only our Drupal 8 knowledge but also our whole business. In this blog post I want to share how the community gave us ideas to improve our business in the near future.

We went to Barcelona to get in personal contact with all the amazing people in the Drupal community. Talking to some contributors to ERPAL, we got lots of new ideas about how to make ERPAL Platform better and more flexible, and how to focus on specific business use cases when building ERPAL for Drupal 8. We also got some very valuable information on how to improve Drop Guard to automate the processing of Drupal updates with integration into individual development workflows.

Talking, for example, to Bojan of Commerce Guys about how to make Drupal Commerce even more flexible and suitable for B2B projects in Drupal 8, we saw the power of personal connections between community members to bring new ideas to life. Thanks for the time he took to listen to our ideas. Having maintained ERPAL Platform for the last year as a Drupal distribution for building flexible business applications, we realized that it is not that easy to build distributions for concrete use cases that give users the same flexibility they know and expect from Drupal itself. There was also a full session about our experience building Drupal distributions. The discussions after this session showed us that there is big interest in the Drupal community in building niche products using Drupal distributions. Nevertheless, there are many challenges to tackle, which were covered in our talk at DrupalCon. See the full session recording below.

The new architecture of Drupal 8 gives us the chance to re-architect ERPAL Platform to leverage the flexibility of Drupal Commerce and provide even more flexible features for managing business processes than today. To make this project a success in Drupal 8, we want to cooperate more closely with other project maintainers and use our network of contributors and Drupal-passionate people to create a Drupal 8 distribution that is a solid and flexible base for building SaaS businesses on Drupal. What makes the Drupal community special is both the open software that people develop and the many people with an open mind for solving real-world problems with Drupal.

As more than 100 new people expressed interest in joining our beta test group for Drop Guard, and some onboardings already took place at DrupalCon, we saw the many different ways people build, operate and maintain their Drupal sites. This gave us the chance to make Drop Guard accessible to many more people by supporting drush make files, Composer and submodules. Talking to some members of the Drupal.org security team, we got confirmation that a system like Drop Guard, which makes the Drupal update process much easier, faster and more reliable, can have a huge positive impact on the security of the Drupal ecosystem. Nevertheless, we realized that some people are afraid of automated updates, as they have concerns that the functionality of their sites will break. That's why we also had some productive meetings with hosting platforms such as Acquia Cloud, pantheon.io and platform.sh to discuss how an integration with hosting platforms and testing systems can reduce the risk of broken sites. I am looking forward to integrating Drop Guard with external services to let automated updates increase the security of the Drupal ecosystem.

Another thing we've learned is that the health of Drupal businesses depends on the release cycle of Drupal. Dries also mentioned this in his keynote, as many companies are postponing new projects until the release of Drupal 8. We also noticed that this DrupalCon had far fewer sponsors and attendees than the last European DrupalCon in Amsterdam in 2014. This shows that DrupalCon is not only about community but also, to a large degree, about business. That's the reason why many Drupal shops try to become more independent of their project business by growing monthly recurring revenue. We realized that providing Drop Guard as a white-label service for Drupal shops can help other businesses start down the road of recurring revenue without huge investments. Selling update support is the most obvious service that produces recurring revenue and adds real value for the customers of Drupal development shops. You can watch the whole session about how to grow recurring revenue as a Drupal shop below.

Now we are excited to attend the next European DrupalCon in Dublin, hopefully with Drupal 8 fully released by then ;-)

If some of you couldn't attend DrupalCon and didn't have the chance to win a free Drop Guard protected site, you can take our survey to join the free beta user group.

agoradesign: Add custom menu item attributes in Drupal 8

Drupal News - September 27, 2015 - 9:26am
We're currently working on a Drupal 8 project where we need the ability to add class attributes to menu items. In D7, one would probably choose the Menu Attributes module to accomplish this. But unfortunately, there is currently no D8 port available. In this quick tutorial I'll show you how to create your own tiny module to solve this problem.

Web Omelette: Webomelette.com gets a new face!

Drupal News - September 27, 2015 - 2:49am

If you are a reader of Webomelette.com you probably know it's been a while since any love has been given to this website. I decided recently to right this wrong and release a refreshed version. Lo and behold, the new version of Web Omelette!

I think it looks a bit fresher, crisper and should be a bit more performant as well. Additionally, I fixed some of the problems we had with copying code fragments by using the SyntaxHighlighter plugin for displaying code fences. This should also make them a bit more readable.

Moreover, you'll notice in the listing also external articles I've written for other websites. Feel free to check out those write-ups as well.

Please let me know if you encounter any bugs or issues with the website. I'd very much appreciate that.

J-P Stacey: My Drupalcon Barcelona notes

Drupal News - September 27, 2015 - 2:34am

As with previous years, I maintained notes on a publicly available Google Doc during Drupalcon Barcelona. I've tidied them up below for future reference, and added links to the recorded sessions where possible.

NB: these are rough notes made during talks. Please take with a pinch of salt.

Read more of "My Drupalcon Barcelona notes"

DrupalCon News: Thank you for coming to DrupalCon Barcelona

Drupal News - September 26, 2015 - 6:19am

Thank you so much for attending DrupalCon Barcelona.  We had an amazing time and hope that you did too.  

After each Con, we ask that you let us know how it went so we can see what we can improve for next time. Please

Fill Out the Survey

nielsdefeyter.nl: Watch Drupalcon Barcelona 2015 sprinting on drupal.org

Drupal News - September 26, 2015 - 4:43am
One little Drupal community secret: the main event of a DrupalCon might be the sprinting... A code sprint is getting developers together for a set amount of time – usually one to two days – and just writing code. That's it. You're not teaching anything. Participants will learn from others as they go, but the...

nielsdefeyter.nl: Watch Drupalcon Barcelona 2015 sessions on Youtube

Drupal News - September 26, 2015 - 4:34am
More than 125 sessions from last week's DrupalCon in Barcelona are on YouTube, posted by the Drupal Association: https://www.youtube.com/user/DrupalAssociation/videos Hope you can learn from them too! Session tracks: Business and Strategy, Business Showcase, Coding and Development, Content Strategy, Core...

Drupal core announcements: PHPTemplate will be removed from Drupal 8 core

Drupal News - September 26, 2015 - 4:27am

Since 2013, PHPTemplate is no longer used in Drupal 8 core and has been replaced by the Twig theme engine.

In 2014, we enabled Twig's autoescape feature in Drupal 8 to provide a more secure foundation for themers. To take full advantage of this feature, core relies on Twig to perform the final escaping of many variables. PHPTemplate is not compatible with this approach, is currently insecure, and is no longer supported, so it will be removed from Drupal 8 core.

Most Drupal themes that used PHPTemplate in Drupal 7 should be updated to use Twig. Drupal core will still support multiple theme engines, but alternate theme engines will need to provide some means of escaping unsafe output or risk security vulnerabilities.

Issue where this change is under discussion: Remove PHPTemplate, and add test coverage for multiple theme engine support

Cocomore: DrupalCon Barcelona 2015 – the last days

Drupal News - September 25, 2015 - 2:00pm

On the last day, many sessions were held and a keynote took place early in the morning again. First, David Rozas gave a presentation on community contribution, and following this, Mike Bell gave a moving speech about mental health.
