Category: Tutorial

Converting comma separated fields to MySQL JSON – a case study

This post is a case study of a job I had to do in a legacy application. It may not apply to your situation, but it might.


There are many ways to store a tree in a relational database. This is far from the best one, but it is still common to see it in the wild.

One of them is called a materialized path: each row stores its ancestry as values separated by a delimiter, in my case a comma (,).

You would have a tree stored in a manner like this:

| id | parent_id | user_id | depth | asc_path  | node |
|----|-----------|---------|-------|-----------|------|
| 1  | 0         | 1       | 1     |           |      |
| 2  | 1         | 2       | 2     | ,1,       | L    |
| 3  | 1         | 13      | 2     | ,1,       | R    |
| 4  | 2         | 3       | 3     | ,2,1,     | L    |
| 5  | 2         | 61      | 3     | ,2,1,     | R    |
| 6  | 13        | 23      | 3     | ,13,1,    | L    |
| 7  | 13        | 22      | 3     | ,13,1,    | R    |
| 8  | 3         | 4       | 4     | ,3,2,1,   | L    |
| 9  | 3         | 156     | 4     | ,3,2,1,   | R    |
| 10 | 22        | 1568    | 4     | ,22,13,1, | L    |
| 11 | 22        | 26      | 4     | ,22,13,1, | R    |
| 12 | 23        | 1476    | 4     | ,23,13,1, | L    |
| 13 | 23        | 690716  | 4     | ,23,13,1, | R    |
| 14 | 61        | 1051    | 4     | ,61,2,1,  | L    |
| 15 | 61        | 62      | 4     | ,61,2,1,  | R    |

The column asc_path stands for the ascending path of the node: the IDs of its ancestors, from its parent up to the root. In this example each node happens to have two children, but the structure is not necessarily a binary tree.

This column has commas at the beginning and at the end because of how searches for an element in the path are written: LIKE "%,id,%". If someone searched for node 2 without the wrapping commas, the query would also match 23, 62 and any other id containing the digit 2.
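The effect of those wrapping commas can be sketched in a few lines of Python (a simulation of the LIKE '%,2,%' match, not MySQL code):

```python
# Paths as stored in asc_path, with leading and trailing commas.
paths = [",1,", ",2,1,", ",23,13,1,", ",61,2,1,"]

# Searching for node 2 with the commas included matches only paths
# where 2 is genuinely one of the ancestors.
matches = [p for p in paths if ",2," in p]

# A bare substring search over-matches: the "2" inside 23 also hits.
naive = [p for p in paths if "2" in p]
```

With the commas, only ,2,1, and ,61,2,1, match (node 2 really is in both paths); without them, ,23,13,1, would match as well.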

Performance

A BTREE index can only be used when the search pattern is anchored at the beginning of the string; since our pattern starts with the wildcard %, that index cannot be used. The only way to make this a bit faster is to create a FULLTEXT index on asc_path.

This is the graphical representation of the example above:

[Figure: tree diagram of the example above]

Searching

To search for a specific element, the query would be:

SELECT
parent_id,
user_id,
depth,
asc_path,
node
FROM tree
WHERE asc_path LIKE '%,13,%';

Result:

| parent_id | user_id | depth | asc_path  | node |
|-----------|---------|-------|-----------|------|
| 13        | 23      | 3     | ,13,1,    | L    |
| 13        | 22      | 3     | ,13,1,    | R    |
| 22        | 1568    | 4     | ,22,13,1, | L    |
| 22        | 26      | 4     | ,22,13,1, | R    |
| 23        | 1476    | 4     | ,23,13,1, | L    |
| 23        | 690716  | 4     | ,23,13,1, | R    |

Converting to a JSON array

Some databases, like PostgreSQL (section 9.42 of its manual), have more functions for converting strings to JSON. In my case, I wanted to store the ascending tree path in a JSON field, which would give me the possibility of using JSON_CONTAINS(json_doc, val) to find the records that have a given node in their path.

To do that, I had to transform the string into a JSON array.

1st step: remove the leading and trailing commas

Before running any UPDATE, let's test what we are doing:

SELECT
parent_id,
user_id,
depth,
asc_path,
TRIM(BOTH ',' FROM asc_path) AS trimmed_commas
FROM tree;

Results:

| parent_id | user_id | depth | asc_path  | trimmed_commas |
|-----------|---------|-------|-----------|----------------|
| 0         | 1       | 1     |           |                |
| 1         | 2       | 2     | ,1,       | 1              |
| 1         | 13      | 2     | ,1,       | 1              |
| 2         | 3       | 3     | ,2,1,     | 2,1            |
| 2         | 61      | 3     | ,2,1,     | 2,1            |
| 13        | 23      | 3     | ,13,1,    | 13,1           |
| 13        | 22      | 3     | ,13,1,    | 13,1           |
| 3         | 4       | 4     | ,3,2,1,   | 3,2,1          |
| 3         | 156     | 4     | ,3,2,1,   | 3,2,1          |
| 22        | 1568    | 4     | ,22,13,1, | 22,13,1        |

2nd step: add brackets to the string

A JSON array is enclosed in square brackets [], and our string needs them to be a valid JSON document:
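What the brackets buy us can be previewed with Python's json module (an illustration, not part of the migration itself): the bare comma list is not a JSON document, while the bracketed string parses as an array.

```python
import json

trimmed = "22,13,1"                  # after TRIM(BOTH ',' FROM asc_path)
with_brackets = "[" + trimmed + "]"  # after CONCAT("[", ..., "]")

def is_valid_json(s):
    """Rough Python analogue of MySQL's JSON_VALID()."""
    try:
        json.loads(s)
        return True
    except json.JSONDecodeError:
        return False
```

Here is_valid_json(trimmed) is False, while is_valid_json(with_brackets) is True and the string parses to the array [22, 13, 1].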

SELECT
parent_id,
user_id,
depth,
asc_path,
TRIM(BOTH ',' FROM asc_path) AS trimmed_commas,
CONCAT("[", TRIM(BOTH ',' FROM asc_path), "]") AS added_brackets
FROM tree;

Results:

| parent_id | user_id | depth | asc_path  | trimmed_commas | added_brackets |
|-----------|---------|-------|-----------|----------------|----------------|
| 0         | 1       | 1     |           |                |                |
| 1         | 2       | 2     | ,1,       | 1              | [1]            |
| 1         | 13      | 2     | ,1,       | 1              | [1]            |
| 2         | 3       | 3     | ,2,1,     | 2,1            | [2,1]          |
| 2         | 61      | 3     | ,2,1,     | 2,1            | [2,1]          |
| 13        | 23      | 3     | ,13,1,    | 13,1           | [13,1]         |
| 13        | 22      | 3     | ,13,1,    | 13,1           | [13,1]         |
| 3         | 4       | 4     | ,3,2,1,   | 3,2,1          | [3,2,1]        |
| 3         | 156     | 4     | ,3,2,1,   | 3,2,1          | [3,2,1]        |
| 22        | 1568    | 4     | ,22,13,1, | 22,13,1        | [22,13,1]      |

3rd step: validate that the changes work

Let's use JSON_VALID() to check whether MySQL accepts our new string as JSON. Keep in mind that when the argument is NULL, the return is also NULL:

SELECT
parent_id,
user_id,
depth,
asc_path,
TRIM(BOTH ',' FROM asc_path) AS trimmed_commas,
CONCAT("[", TRIM(BOTH ',' FROM asc_path), "]") AS added_brackets,
JSON_VALID(CONCAT("[", TRIM(BOTH ',' FROM asc_path), "]")) AS json_valid
FROM tree;

Results:

| parent_id | user_id | depth | asc_path  | trimmed_commas | added_brackets | json_valid |
|-----------|---------|-------|-----------|----------------|----------------|------------|
| 0         | 1       | 1     |           |                |                |            |
| 1         | 2       | 2     | ,1,       | 1              | [1]            | 1          |
| 1         | 13      | 2     | ,1,       | 1              | [1]            | 1          |
| 2         | 3       | 3     | ,2,1,     | 2,1            | [2,1]          | 1          |
| 2         | 61      | 3     | ,2,1,     | 2,1            | [2,1]          | 1          |
| 13        | 23      | 3     | ,13,1,    | 13,1           | [13,1]         | 1          |
| 13        | 22      | 3     | ,13,1,    | 13,1           | [13,1]         | 1          |
| 3         | 4       | 4     | ,3,2,1,   | 3,2,1          | [3,2,1]        | 1          |
| 3         | 156     | 4     | ,3,2,1,   | 3,2,1          | [3,2,1]        | 1          |
| 22        | 1568    | 4     | ,22,13,1, | 22,13,1        | [22,13,1]      | 1          |
| 22        | 26      | 4     | ,22,13,1, | 22,13,1        | [22,13,1]      | 1          |
| 23        | 1476    | 4     | ,23,13,1, | 23,13,1        | [23,13,1]      | 1          |
| 23        | 690716  | 4     | ,23,13,1, | 23,13,1        | [23,13,1]      | 1          |
| 61        | 1051    | 4     | ,61,2,1,  | 61,2,1         | [61,2,1]       | 1          |
| 61        | 62      | 4     | ,61,2,1,  | 61,2,1         | [61,2,1]       | 1          |

Replacing 1st step and 2nd step with a function

To keep the query readable and not messy, you can create a function. I decided to create to_json_array(input_string, delimiter_char):
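The body of to_json_array is not reproduced here; its logic is just the TRIM/CONCAT pipeline from the previous steps packaged as a function. As a sketch, this is the equivalent transformation in Python (an illustration only, not the MySQL function itself; the empty-path case is my assumption):

```python
def to_json_array(input_string, delimiter_char=","):
    """Mimic the MySQL helper: ',22,13,1,' -> '[22, 13, 1]'."""
    if input_string is None:
        return None                     # NULL in, NULL out
    trimmed = input_string.strip(delimiter_char)
    if not trimmed:
        return "[]"                     # assumption: empty path -> empty array
    items = trimmed.split(delimiter_char)
    # The function's output (see the table below) uses ", " between items.
    return "[" + ", ".join(items) + "]"
```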

Running the query only with to_json_array on MySQL:

SELECT
parent_id,
user_id,
depth,
asc_path,
to_json_array(asc_path, ',') AS to_json_array,
JSON_VALID(to_json_array(asc_path, ',')) AS is_to_json_array_valid,
node
FROM tree;

Result:

| parent_id | user_id | depth | asc_path  | to_json_array | is_to_json_array_valid | node |
|-----------|---------|-------|-----------|---------------|------------------------|------|
| 0         | 1       | 1     |           |               |                        |      |
| 1         | 2       | 2     | ,1,       | [1]           | 1                      | L    |
| 1         | 13      | 2     | ,1,       | [1]           | 1                      | R    |
| 2         | 3       | 3     | ,2,1,     | [2, 1]        | 1                      | L    |
| 2         | 61      | 3     | ,2,1,     | [2, 1]        | 1                      | R    |
| 13        | 23      | 3     | ,13,1,    | [13, 1]       | 1                      | L    |
| 13        | 22      | 3     | ,13,1,    | [13, 1]       | 1                      | R    |
| 3         | 4       | 4     | ,3,2,1,   | [3, 2, 1]     | 1                      | L    |
| 3         | 156     | 4     | ,3,2,1,   | [3, 2, 1]     | 1                      | R    |
| 22        | 1568    | 4     | ,22,13,1, | [22, 13, 1]   | 1                      | L    |

Disclaimer

This function is not native, and its use in production is not guaranteed.

Notice that the database reports the JSON as valid, which makes it possible to convert that TEXT into a new column, asc_path_json:

ALTER TABLE tree
ADD COLUMN asc_path_json JSON
AFTER asc_path;

UPDATE tree
SET asc_path_json = to_json_array(asc_path, ',');

Which gives us the ability to check more quickly if an item is in the path for that node:

SELECT *
FROM tree
WHERE JSON_CONTAINS(asc_path_json, "13");

Result:

| id | parent_id | user_id | depth | asc_path  | asc_path_json | node |
|----|-----------|---------|-------|-----------|---------------|------|
| 6  | 13        | 23      | 3     | ,13,1,    | [13, 1]       | L    |
| 7  | 13        | 22      | 3     | ,13,1,    | [13, 1]       | R    |
| 10 | 22        | 1568    | 4     | ,22,13,1, | [22, 13, 1]   | L    |
| 11 | 22        | 26      | 4     | ,22,13,1, | [22, 13, 1]   | R    |
| 12 | 23        | 1476    | 4     | ,23,13,1, | [23, 13, 1]   | L    |
| 13 | 23        | 690716  | 4     | ,23,13,1, | [23, 13, 1]   | R    |
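The behaviour of that query can be simulated in Python: parse the array and test membership. Note that in the MySQL call the candidate is passed as the JSON document "13", not as a bare number.

```python
import json

def contains_node(asc_path_json, node_id):
    """Rough equivalent of JSON_CONTAINS(asc_path_json, '13')."""
    return node_id in json.loads(asc_path_json)

rows = ["[13, 1]", "[22, 13, 1]", "[61, 2, 1]"]
hits = [r for r in rows if contains_node(r, 13)]
```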

Fast data import trick

A few weeks ago my friend Frank de Jonge told me he had brought an import into a MySQL server down from more than 10 hours to 16 minutes. According to him it had to do with several field types (fields much longer than the actual data), the number of indexes, and the constraints on the tables. We are talking about 1 million records here. He wondered if it was possible to make it even faster.

The basics

Turns out there are many ways of importing data into a database; it all depends on where you are getting the data from and where you want it to go. Let me give you a bit more context: you may want to get data from a legacy application that exports to CSV into your database server, or even move data between different servers.

If you are pulling data from a MySQL table into another MySQL table (let's assume they are on different servers), you might as well use mysqldump.

To export a single table:

$ mysqldump -h localhost -u root -p --extended-insert --quick --no-create-info mydb mytable | gzip > mytable.sql.gz

A bit more about this line:

  • --extended-insert: ensures there is not one INSERT per row; a single statement can insert dozens of rows.
  • --quick: useful when dumping large tables. By default MySQL reads the whole table into memory and then dumps it into the file; with this option the data is streamed without consuming much memory.
  • --no-create-info: only the data is exported; no CREATE TABLE statements will be added.

The complex

The problem my friend faced was a bit more complex. He needed to generate the dump file himself because his data came from somewhere else (later on I advised him of the benefits of LOAD DATA INFILE), but since 90% of his work was already done, he wanted to know:

Why is it faster when I insert blocks of 50 rows than when I insert blocks of 500?

There could be N reasons for that:

  • buffering 500 rows into memory is slower than buffering 50; remember, you are reading from disk, and that is always slow.
  • if no transactions are used, the indexes get rebuilt at the end of each INSERT. For 1 million rows at 50 values per statement we have 20k INSERTs, while with 500 values it would be 2k statements. My speculation here is that since InnoDB indexes are BTREEs, building them slowly means you "know" where the values sit in the tree, so sorting and organising is a fast search, while with 500 items you need to reorganise a lot of information at once. Again, this is speculation.
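To make the 50-versus-500 trade-off concrete, here is a Python sketch of how rows get chunked into extended INSERT statements (the table and column names are made up for the example):

```python
def batched_inserts(rows, batch_size, table="mytable"):
    """Yield one extended INSERT statement per batch of rows."""
    for i in range(0, len(rows), batch_size):
        batch = rows[i:i + batch_size]
        values = ", ".join("(%d, '%s')" % (rid, name) for rid, name in batch)
        yield "INSERT INTO %s (id, name) VALUES %s;" % (table, values)

rows = [(i, "user%d" % i) for i in range(1000)]
small = list(batched_inserts(rows, 50))    # more, shorter statements
large = list(batched_inserts(rows, 500))   # fewer, longer statements
```

For 1,000 rows this produces 20 statements at 50 rows each versus 2 statements at 500; scale that to a million rows and you get the 20k-versus-2k figure above.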

Suggestions

Transactions

My first suggestion was: wrap everything in a single transaction. Put a START TRANSACTION at the beginning and a COMMIT statement at the end. That way the index rebuilding and foreign key checks are done once, at the end of the script.
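The principle can be illustrated with Python's built-in sqlite3 module (SQLite, not MySQL, but the idea of one transaction around all the INSERTs is the same):

```python
import sqlite3

# isolation_level=None puts the connection in autocommit mode,
# so we control the transaction boundaries ourselves.
conn = sqlite3.connect(":memory:", isolation_level=None)
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, name TEXT)")

# One transaction around all the INSERTs: index maintenance and
# durability costs are paid once at COMMIT, not per statement.
conn.execute("BEGIN")
for i in range(10000):
    conn.execute("INSERT INTO t (id, name) VALUES (?, ?)", (i, "row%d" % i))
conn.execute("COMMIT")

count = conn.execute("SELECT COUNT(*) FROM t").fetchone()[0]
```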

He reported a minor improvement on performance.

The Danger

I knew from the beginning of a way to make his import really fast, but since the source of his data wasn't as trustworthy as the database itself, it could result in duplicated data or missing foreign keys; it could end really, really badly.

When you use mysqldump, MySQL puts this option in the dump by default, because it's fair to assume you will be importing into an empty database, with no data integrity problems. That wasn't the case here.

The data had been manipulated before insertion, so the trick I suggested to him was, and I quote:

SET foreign_key_checks = 0;
/* do your stuff REALLY CAREFULLY */
SET foreign_key_checks = 1;

The import went from 16 min to 6 min. He was super happy 😀.

And people on the internet got curious (because Frank is famous now, apparently).

I confess it was fun to see the time cut down by more than half, but use this with caution.

An even faster way

CSV files. Yes, that’s faster. Specifically TSV, since any string can have a comma.

To generate:

$ mysqldump -h localhost -u root -p --tab=/tmp mydb mytable

Or, if you are producing the file yourself from another source, don't forget to use \N for NULL values.
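If you build the TSV yourself, the only special case is NULL, which LOAD DATA INFILE expects as \N. A Python sketch (the column layout is hypothetical):

```python
rows = [
    (1, "alice", "alice@example.com"),
    (2, "bob", None),  # missing e-mail: must become \N in the file
]

def to_tsv_line(row):
    # \N (backslash + capital N) marks a NULL for LOAD DATA INFILE
    return "\t".join(r"\N" if field is None else str(field) for field in row)

lines = [to_tsv_line(r) for r in rows]
```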

To Read:

$ mysql -h localhost -u root -p
mysql> LOAD DATA INFILE '/tmp/mytable.txt' INTO TABLE mytable;
Query OK, 881426 rows affected (29.30 sec)
Records: 881426 Deleted: 0 Skipped: 0 Warnings: 0

The same data imported with bulk INSERTs took over a minute. There are many variables involved in that statement, such as buffer sizes and the key checking itself, but for high volumes, importing straight from a text file is still the fastest option.

Conclusion

As I said before, it was just a matter of disabling the constraint checks in the script. Only do that if you are sure the data is good; otherwise, options like net_buffer_length, max_allowed_packet and read_buffer_size can help you import big SQL files. And in most cases this should hold: Data Integrity > Performance.

Laravel with Grunt, Bower, Foundation and Sass

I needed to use Grunt and Bower with Laravel for a project. So I did some digging and found Harianto van Insulide's tutorial. He did something similar to what I needed, so I followed his tutorial and made my own modifications. This is the result!

Installing Dependencies

Composer

If you don’t have composer installed, just enter the following:

$ curl -sS https://getcomposer.org/installer | php

Ruby/Sass

If you use Windows, you need to download Ruby: https://www.ruby-lang.org/pt/downloads/.

On Mac OS X, Ruby comes preinstalled.

On Linux, you can install it via apt-get or yum.

With Ruby available, just run:

$ gem install compass
$ gem install sass

Laravel

To install it, you can download the latest version as a package or use the composer command:

$ composer create-project laravel/laravel --prefer-dist

NodeJS

You can get the installer from NodeJS website: http://nodejs.org/download/.


Grunt and Bower

Both can be installed globally; if you want them installed locally instead, just remove the -g from the commands below:

$ npm install -g grunt-cli
$ npm install -g bower

Grunt Initialization

Grunt is a task runner: it lets you automate tasks like Sass compilation, JavaScript minification, CDN uploads and anything else you need. Grunt has a lot of plugins, so browse their site to find some awesome ones!

I recommend running the following commands inside the Laravel folder. You can run them in the project root, but that is a personal choice; just remember to update the paths if you use a different location.
As we do with git, so we do with grunt:

$ npm init

You can fill in the fields as I did:

name: laravelTutorial
version: 0.1.0
description: This is a Laravel with Grunt, Bower, Foundation and Sass Tutorial made by @gabidavila: http://gabriela.io
entry point: Gruntfile.js
test command: (press Enter)
git repository: git@github.com:gabidavila/laravel-grunt-bower-foundation.git
keywords: laravel, grunt, foundation, sass, bower
author: Gabriela D'Avila <gabidavila>
license: MIT

A new file, package.json, will be generated with the information about your grunt setup, most importantly the entry point, Gruntfile.js. Don't worry about the scripts/test item; you can change it in the future.

{
  "name": "laravelTutorial",
  "version": "0.1.0",
  "description": "This is a Laravel with Grunt, Bower, Foundation and Sass Tutorial made by @gabidavila: http://en.davila.blog.br",
  "main": "Gruntfile.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "repository": {
    "type": "git",
    "url": "git@github.com:gabidavila/laravel-grunt-bower-foundation.git"
  },
  "keywords": [
    "laravel",
    "grunt",
    "foundation",
    "sass",
    "bower"
  ],
  "author": "Gabriela D'Avila <gabidavila>",
  "license": "MIT",
  "bugs": {
    "url": "https://github.com/gabidavila/laravel-grunt-bower-foundation/issues"
  },
  "homepage": "https://github.com/gabidavila/laravel-grunt-bower-foundation"
}

Grunt Plugins

In this tutorial I’ll use some features of Grunt:

  • Concat: concatenates all the JavaScript into a single file (grunt-contrib-concat)
  • Uglify: minifies all the JavaScript (grunt-contrib-uglify)
  • PHPUnit: allows us to run PHP unit tests (grunt-phpunit)
  • Sass: compiles Sass (grunt-contrib-sass)
Installing them is very simple; just run:

# --save-dev saves these plugins in package.json, so they are kept there 🙂

$ npm install grunt --save-dev
$ npm install grunt-contrib-concat --save-dev
$ npm install grunt-contrib-uglify --save-dev
$ npm install grunt-phpunit --save-dev
$ npm install grunt-contrib-compass --save-dev
$ npm install grunt-contrib-sass --save-dev

# OR a single command :O

$ npm install grunt grunt-contrib-concat grunt-contrib-uglify grunt-phpunit grunt-contrib-compass grunt-contrib-sass --save-dev

The --save-dev  argument includes the plugins in your package.json file.


Bower

Bower is a package manager; a better description would be a dependency manager for the frontend, just like Composer is for everything else. You can learn more here: http://bower.io/.

By default, when we ask bower to download a dependency, it creates a bower_components folder in the path where the command was run. So we need to adjust that path to keep things organized.

Create a .bowerrc file in your Laravel root:

{
  "directory": "public/assets/vendor"
}

This tells bower where to put the downloaded dependencies.

Installing Foundation and other dependencies

As with composer, let's create a bower.json file; initially, just add your project name:

{
  "name": "laravelTutorial"
}

This file is updated as dependencies are installed with the -S argument. To install foundation:

$ bower install foundation -S

Foundation itself installs some dependencies, like jQuery, modernizr and fastclick. If you look at your bower.json file you'll see:

{
  "name": "laravelTutorial",
  "dependencies": {
    "foundation": "~5.3.3"
  }
}

And at the public/assets/vendor folder:

public/assets/vendor/
 |-- fastclick
 |-- foundation
 |-- jquery
 |-- jquery-placeholder
 |-- jquery.cookie
 `-- modernizr

I organized the structure of the JavaScript and CSS by creating the following folders and files:

public/
 |-- assets
 |   |-- stylesheets
 |   |   |-- base.scss      # base from foundation
 |   |   |-- custom.scss    # for custom scss
 |   |   `-- variables.scss # for custom variables
 |   `-- javascripts
 |       `-- custom.js      # for custom javascript
 |-- css
 `-- js

The content of the file public/assets/stylesheets/base.scss is:

/*
* public/assets/stylesheets/base.scss
* Files from foundation
*/
@import 'variables';
@import '../vendor/foundation/scss/normalize';
@import '../vendor/foundation/scss/foundation';

The variables.scss file is a copy of _settings.scss, so you can customize your scss. You can see the content here.


Gruntfile.js

Now is the fun part. Here we tell grunt how to behave: where to put the compiled SCSS files, how to minify the JavaScript, and so on.

This is a minimum Gruntfile.js template:

//Gruntfile
module.exports = function(grunt) {

  // Initializing the configuration object
  grunt.initConfig({

    // Path variables
    paths: {
      // Development paths: where the SASS files, etc. live
      assets: {
        css: './public/assets/stylesheets/',
        js: './public/assets/javascripts/',
        vendor: './public/assets/vendor/'
      },
      // Production paths: where Grunt outputs the files
      css: './public/css/',
      js: './public/js/'
    },

    // Task configuration
    concat: {
      //...
    },
    sass: {
      //...
    },
    uglify: {
      //...
    },
    phpunit: {
      //...
    },
    watch: {
      //...
    }
  });

  // Plugin loading

  // Task definition

};

Configuring SCSS compiling

In the sass section, add the following:

// Part of Gruntfile.js, this is the sass section
sass: {
  css: {
    options: {
      style: 'compressed',
      compass: true
    },
    // This will pick up all the scss files in /public/assets/stylesheets
    files: [{
      expand: true,
      cwd: '<%= paths.assets.css %>',
      src: '**/*.scss',
      dest: '<%= paths.css %>',
      ext: '.css'
    }]
  }
},

// This is at the end of the file, the plugin section:
// Plugin loading
grunt.loadNpmTasks('grunt-contrib-compass');
grunt.loadNpmTasks('grunt-contrib-sass');

Note that at the end of the file we added the plugin loading for compass and sass. To check that this is working, just execute: grunt sass

The SCSS will be compiled into the public/css/base.css file.

The Javascript

Foundation uses its JavaScript dependencies in different locations of the HTML file: modernizr goes in the head section and jQuery in the footer. So I think the best approach here is to separate these into two targets, js_header and js_footer. Each target takes its JavaScript files and creates, respectively, scripts_header.js and scripts_footer.js.

// Part of Gruntfile.js, this is the concat section
concat: {
  options: {
    separator: ';'
  },
  js_header: {
    src: [
      '<%= paths.assets.vendor %>modernizr/modernizr.js',
      '<%= paths.assets.js %>custom.js'
    ],
    dest: '<%= paths.js %>expanded/scripts_header.js'
  },
  js_footer: {
    src: [
      '<%= paths.assets.vendor %>jquery/dist/jquery.js',
      '<%= paths.assets.vendor %>jquery.cookie/jquery.cookie.js',
      '<%= paths.assets.vendor %>jquery.placeholder/jquery.placeholder.js',
      '<%= paths.assets.vendor %>fastclick/lib/fastclick.js',
      '<%= paths.assets.vendor %>foundation/js/foundation.js'
    ],
    dest: '<%= paths.js %>expanded/scripts_footer.js'
  }
},

// This is at the end of the file, the plugin section:
// Plugin loading
grunt.loadNpmTasks('grunt-contrib-compass');
grunt.loadNpmTasks('grunt-contrib-sass');
grunt.loadNpmTasks('grunt-contrib-concat');

You can read more about Javascript files setup in Foundation here: http://foundation.zurb.com/docs/javascript.html.

To check that this is working, just execute grunt concat; the files will be added to the /public/js directory. If you want to concat just the header or the footer section, use grunt concat:js_header or grunt concat:js_footer:

Files Generated by grunt concat

public
 `-- js
     `-- expanded
         |-- scripts_footer.js
         `-- scripts_header.js

Now the JavaScript is bundled into single files as we defined; to make it load faster, we can minify it. The uglify task is defined as below:

// Part of Gruntfile.js, this is the uglify section
uglify: {
  options: {
    // Grunt can mangle variable names, but that may not be a good idea
    // for you; I leave this option as false
    mangle: false
  },
  js: {
    // Grunt will search for "**/*.js" when the task runs and build the
    // appropriate src-dest file mappings, so you don't need to update
    // the Gruntfile when files are added or removed.
    files: [{
      expand: true,
      cwd: '<%= paths.js %>',
      src: '**/*.js',
      dest: '<%= paths.js %>min',
      ext: '.min.js'
    }]
  }
},

// This is at the end of the file, the plugin section:
// Plugin loading
grunt.loadNpmTasks('grunt-contrib-compass');
grunt.loadNpmTasks('grunt-contrib-sass');
grunt.loadNpmTasks('grunt-contrib-concat');
grunt.loadNpmTasks('grunt-contrib-uglify');

After running grunt uglify, grunt will create a min folder inside /public/js with two files: scripts_header.min.js and scripts_footer.min.js.

Files Generated by grunt uglify

public
 `-- js
     |-- min
     |   |-- scripts_footer.min.js
     |   `-- scripts_header.min.js
     `-- expanded
         |-- scripts_footer.js
         `-- scripts_header.js

PHPUnit

So, if you work with PHPUnit, you can have Grunt run your tests for you. You just have to configure where the test classes are and the location of the binary. You may not have phpunit installed in your project; in that case, add the following to your composer.json file:

{
  "require": {
    "laravel/framework": "4.2.*",
    "phpunit/phpunit": "4.*"
  }
}

And run composer update to install the binary in /vendor/bin.

Then just add the phpunit section to Gruntfile.js:

// Part of Gruntfile.js, this is the phpunit section
phpunit: {
  classes: {
    dir: 'app/tests/' // location of the tests
  },
  options: {
    bin: 'vendor/bin/phpunit',
    colors: true
  }
},

// This is at the end of the file, the plugin section:
// Plugin loading
grunt.loadNpmTasks('grunt-contrib-compass');
grunt.loadNpmTasks('grunt-contrib-sass');
grunt.loadNpmTasks('grunt-contrib-concat');
grunt.loadNpmTasks('grunt-contrib-uglify');
grunt.loadNpmTasks('grunt-phpunit');

Running the tests: grunt phpunit

Below is the output:

[Screenshot: grunt phpunit run output in PhpStorm]

At last!

Until now we defined 4 tasks:

  • sass
  • concat
  • uglify
  • phpunit

To run all four tasks with a single command, you can register a default task. I defined it so that running plain grunt in the terminal runs all four:

// Plugin loading
grunt.loadNpmTasks('grunt-contrib-compass');
grunt.loadNpmTasks('grunt-contrib-sass');
grunt.loadNpmTasks('grunt-contrib-concat');
grunt.loadNpmTasks('grunt-contrib-uglify');
grunt.loadNpmTasks('grunt-phpunit');

// Task definition
grunt.registerTask('default', ['sass', 'concat', 'uglify', 'phpunit']);

The final Gruntfile.js:

//Gruntfile
module.exports = function (grunt) {

  // Initializing the configuration object
  grunt.initConfig({

    // Path variables
    paths: {
      // Development paths: where the SASS files, etc. live
      assets: {
        css: './public/assets/stylesheets/',
        js: './public/assets/javascripts/',
        vendor: './public/assets/vendor/'
      },
      // Production paths: where Grunt outputs the files
      css: './public/css/',
      js: './public/js/'
    },

    // Task configuration
    concat: {
      options: {
        separator: ';'
      },
      js_header: {
        src: [
          '<%= paths.assets.vendor %>modernizr/modernizr.js',
          '<%= paths.assets.js %>custom.js'
        ],
        dest: '<%= paths.js %>expanded/scripts_header.js'
      },
      js_footer: {
        src: [
          '<%= paths.assets.vendor %>jquery/dist/jquery.js',
          '<%= paths.assets.vendor %>jquery.cookie/jquery.cookie.js',
          '<%= paths.assets.vendor %>jquery.placeholder/jquery.placeholder.js',
          '<%= paths.assets.vendor %>fastclick/lib/fastclick.js',
          '<%= paths.assets.vendor %>foundation/js/foundation.js'
        ],
        dest: '<%= paths.js %>expanded/scripts_footer.js'
      }
    },
    sass: {
      css: {
        options: {
          style: 'compressed',
          compass: true
        },
        files: [{
          expand: true,
          cwd: '<%= paths.assets.css %>',
          src: '**/*.scss',
          dest: '<%= paths.css %>',
          ext: '.css'
        }]
      }
    },
    uglify: {
      options: {
        // Grunt can mangle variable names, but that may not be a good
        // idea for you; I leave this option as false
        mangle: false
      },
      js: {
        // Grunt will search for "**/*.js" when the task runs and build
        // the appropriate src-dest file mappings, so you don't need to
        // update the Gruntfile when files are added or removed.
        files: [{
          expand: true,
          cwd: '<%= paths.js %>',
          src: '**/*.js',
          dest: '<%= paths.js %>min',
          ext: '.min.js'
        }]
      }
    },
    phpunit: {
      classes: {
        dir: 'app/tests/' // location of the tests
      },
      options: {
        bin: 'vendor/bin/phpunit',
        colors: true
      }
    }
  });

  // Plugin loading
  grunt.loadNpmTasks('grunt-contrib-compass');
  grunt.loadNpmTasks('grunt-contrib-sass');
  grunt.loadNpmTasks('grunt-contrib-concat');
  grunt.loadNpmTasks('grunt-contrib-uglify');
  grunt.loadNpmTasks('grunt-phpunit');

  // Task definition
  grunt.registerTask('default', ['sass', 'concat', 'uglify', 'phpunit']);
};

Conclusion

That's it! It can be a little overwhelming having to do all of this, but it is worth it, because now you can work in an organized way.

Leave a comment below if you have any questions.