Mirror of https://github.com/superseriousbusiness/gotosocial.git (synced 2025-10-29 07:02:26 -05:00)
[bugfix] Fix Swagger spec and add test script (#2698)
* Add Swagger spec test script
* Fix Swagger spec errors not related to statuses with polls
* Add API tests that post a status with a poll
* Fix creating a status with a poll from form params
* Fix Swagger spec errors related to statuses with polls (this is the last error)
* Fix Swagger spec warnings not related to unused definitions
* Suppress a duplicate list update params definition that was somehow causing wrong param names
* Add Swagger test to CI
  - updates Drone config
  - vendorizes go-swagger
  - fixes a file extension issue that caused the test script to generate JSON instead of YAML with the vendorized version
* Put `Sample: ` on its own line everywhere
* Remove unused id param from emojiCategoriesGet
* Add 5 more pairs of profile fields to account update API Swagger
* Remove Swagger prefix from dummy fields (it makes the generated code look weird)
* Manually annotate params for statusCreate operation
* Fix all remaining Swagger spec warnings
  - Change some models into operation parameters
  - Ignore models that already correspond to manually documented operation parameters but can't be trivially changed (those with file fields)
* Documented that creating a status with scheduled_at isn't implemented yet
* sign drone.yml
* Fix filter API Swagger errors
* fixup! Fix filter API Swagger errors

---------

Co-authored-by: tobi <tobi.smethurst@protonmail.com>
parent 68c8fe67cc
commit fc3741365c
672 changed files with 135624 additions and 713 deletions
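One of the fixes above concerns creating a status with a poll from form params. As a rough illustration of the request shape those new API tests exercise (not code from this diff; the `/api/v1/statuses` path, the `poll[...]` field names, the host, and the token are the conventional Mastodon-style client API and are assumptions here), a client call might look like this minimal Go sketch:

```go
package main

import (
	"fmt"
	"net/http"
	"net/url"
	"strings"
)

func main() {
	// Form-encoded status with a poll, as a Mastodon-style client would send it.
	// Field names are illustrative, not taken from the GoToSocial source in this commit.
	form := url.Values{}
	form.Set("status", "Poll from form params")
	form.Add("poll[options][]", "yes")
	form.Add("poll[options][]", "no")
	form.Set("poll[expires_in]", "300") // seconds until the poll closes

	req, _ := http.NewRequest(http.MethodPost, "https://example.org/api/v1/statuses", strings.NewReader(form.Encode()))
	req.Header.Set("Content-Type", "application/x-www-form-urlencoded")
	req.Header.Set("Authorization", "Bearer <access token>")

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		fmt.Println(err)
		return
	}
	defer resp.Body.Close()
	fmt.Println(resp.Status)
}
```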
vendor/github.com/Masterminds/goutils/.travis.yml (new file, generated, vendored, 18 additions)
@@ -0,0 +1,18 @@
language: go

go:
  - 1.6
  - 1.7
  - 1.8
  - tip

script:
  - go test -v

notifications:
  webhooks:
    urls:
      - https://webhooks.gitter.im/e/06e3328629952dabe3e0
    on_success: change  # options: [always|never|change] default: always
    on_failure: always  # options: [always|never|change] default: always
    on_start: never     # options: [always|never|change] default: always
vendor/github.com/Masterminds/goutils/CHANGELOG.md (new file, generated, vendored, 8 additions)
@@ -0,0 +1,8 @@
# 1.0.1 (2017-05-31)

## Fixed
- #21: Fix generation of alphanumeric strings (thanks @dbarranco)

# 1.0.0 (2014-04-30)

- Initial release.
vendor/github.com/Masterminds/goutils/LICENSE.txt (new file, generated, vendored, 202 additions)
@@ -0,0 +1,202 @@

                                 Apache License
                           Version 2.0, January 2004
                        http://www.apache.org/licenses/

   TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION

   1. Definitions.

      "License" shall mean the terms and conditions for use, reproduction,
      and distribution as defined by Sections 1 through 9 of this document.

      "Licensor" shall mean the copyright owner or entity authorized by
      the copyright owner that is granting the License.

      "Legal Entity" shall mean the union of the acting entity and all
      other entities that control, are controlled by, or are under common
      control with that entity. For the purposes of this definition,
      "control" means (i) the power, direct or indirect, to cause the
      direction or management of such entity, whether by contract or
      otherwise, or (ii) ownership of fifty percent (50%) or more of the
      outstanding shares, or (iii) beneficial ownership of such entity.

      "You" (or "Your") shall mean an individual or Legal Entity
      exercising permissions granted by this License.

      "Source" form shall mean the preferred form for making modifications,
      including but not limited to software source code, documentation
      source, and configuration files.

      "Object" form shall mean any form resulting from mechanical
      transformation or translation of a Source form, including but
      not limited to compiled object code, generated documentation,
      and conversions to other media types.

      "Work" shall mean the work of authorship, whether in Source or
      Object form, made available under the License, as indicated by a
      copyright notice that is included in or attached to the work
      (an example is provided in the Appendix below).

      "Derivative Works" shall mean any work, whether in Source or Object
      form, that is based on (or derived from) the Work and for which the
      editorial revisions, annotations, elaborations, or other modifications
      represent, as a whole, an original work of authorship. For the purposes
      of this License, Derivative Works shall not include works that remain
      separable from, or merely link (or bind by name) to the interfaces of,
      the Work and Derivative Works thereof.

      "Contribution" shall mean any work of authorship, including
      the original version of the Work and any modifications or additions
      to that Work or Derivative Works thereof, that is intentionally
      submitted to Licensor for inclusion in the Work by the copyright owner
      or by an individual or Legal Entity authorized to submit on behalf of
      the copyright owner. For the purposes of this definition, "submitted"
      means any form of electronic, verbal, or written communication sent
      to the Licensor or its representatives, including but not limited to
      communication on electronic mailing lists, source code control systems,
      and issue tracking systems that are managed by, or on behalf of, the
      Licensor for the purpose of discussing and improving the Work, but
      excluding communication that is conspicuously marked or otherwise
      designated in writing by the copyright owner as "Not a Contribution."

      "Contributor" shall mean Licensor and any individual or Legal Entity
      on behalf of whom a Contribution has been received by Licensor and
      subsequently incorporated within the Work.

   2. Grant of Copyright License. Subject to the terms and conditions of
      this License, each Contributor hereby grants to You a perpetual,
      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
      copyright license to reproduce, prepare Derivative Works of,
      publicly display, publicly perform, sublicense, and distribute the
      Work and such Derivative Works in Source or Object form.

   3. Grant of Patent License. Subject to the terms and conditions of
      this License, each Contributor hereby grants to You a perpetual,
      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
      (except as stated in this section) patent license to make, have made,
      use, offer to sell, sell, import, and otherwise transfer the Work,
      where such license applies only to those patent claims licensable
      by such Contributor that are necessarily infringed by their
      Contribution(s) alone or by combination of their Contribution(s)
      with the Work to which such Contribution(s) was submitted. If You
      institute patent litigation against any entity (including a
      cross-claim or counterclaim in a lawsuit) alleging that the Work
      or a Contribution incorporated within the Work constitutes direct
      or contributory patent infringement, then any patent licenses
      granted to You under this License for that Work shall terminate
      as of the date such litigation is filed.

   4. Redistribution. You may reproduce and distribute copies of the
      Work or Derivative Works thereof in any medium, with or without
      modifications, and in Source or Object form, provided that You
      meet the following conditions:

      (a) You must give any other recipients of the Work or
          Derivative Works a copy of this License; and

      (b) You must cause any modified files to carry prominent notices
          stating that You changed the files; and

      (c) You must retain, in the Source form of any Derivative Works
          that You distribute, all copyright, patent, trademark, and
          attribution notices from the Source form of the Work,
          excluding those notices that do not pertain to any part of
          the Derivative Works; and

      (d) If the Work includes a "NOTICE" text file as part of its
          distribution, then any Derivative Works that You distribute must
          include a readable copy of the attribution notices contained
          within such NOTICE file, excluding those notices that do not
          pertain to any part of the Derivative Works, in at least one
          of the following places: within a NOTICE text file distributed
          as part of the Derivative Works; within the Source form or
          documentation, if provided along with the Derivative Works; or,
          within a display generated by the Derivative Works, if and
          wherever such third-party notices normally appear. The contents
          of the NOTICE file are for informational purposes only and
          do not modify the License. You may add Your own attribution
          notices within Derivative Works that You distribute, alongside
          or as an addendum to the NOTICE text from the Work, provided
          that such additional attribution notices cannot be construed
          as modifying the License.

      You may add Your own copyright statement to Your modifications and
      may provide additional or different license terms and conditions
      for use, reproduction, or distribution of Your modifications, or
      for any such Derivative Works as a whole, provided Your use,
      reproduction, and distribution of the Work otherwise complies with
      the conditions stated in this License.

   5. Submission of Contributions. Unless You explicitly state otherwise,
      any Contribution intentionally submitted for inclusion in the Work
      by You to the Licensor shall be under the terms and conditions of
      this License, without any additional terms or conditions.
      Notwithstanding the above, nothing herein shall supersede or modify
      the terms of any separate license agreement you may have executed
      with Licensor regarding such Contributions.

   6. Trademarks. This License does not grant permission to use the trade
      names, trademarks, service marks, or product names of the Licensor,
      except as required for reasonable and customary use in describing the
      origin of the Work and reproducing the content of the NOTICE file.

   7. Disclaimer of Warranty. Unless required by applicable law or
      agreed to in writing, Licensor provides the Work (and each
      Contributor provides its Contributions) on an "AS IS" BASIS,
      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
      implied, including, without limitation, any warranties or conditions
      of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
      PARTICULAR PURPOSE. You are solely responsible for determining the
      appropriateness of using or redistributing the Work and assume any
      risks associated with Your exercise of permissions under this License.

   8. Limitation of Liability. In no event and under no legal theory,
      whether in tort (including negligence), contract, or otherwise,
      unless required by applicable law (such as deliberate and grossly
      negligent acts) or agreed to in writing, shall any Contributor be
      liable to You for damages, including any direct, indirect, special,
      incidental, or consequential damages of any character arising as a
      result of this License or out of the use or inability to use the
      Work (including but not limited to damages for loss of goodwill,
      work stoppage, computer failure or malfunction, or any and all
      other commercial damages or losses), even if such Contributor
      has been advised of the possibility of such damages.

   9. Accepting Warranty or Additional Liability. While redistributing
      the Work or Derivative Works thereof, You may choose to offer,
      and charge a fee for, acceptance of support, warranty, indemnity,
      or other liability obligations and/or rights consistent with this
      License. However, in accepting such obligations, You may act only
      on Your own behalf and on Your sole responsibility, not on behalf
      of any other Contributor, and only if You agree to indemnify,
      defend, and hold each Contributor harmless for any liability
      incurred by, or claims asserted against, such Contributor by reason
      of your accepting any such warranty or additional liability.

   END OF TERMS AND CONDITIONS

   APPENDIX: How to apply the Apache License to your work.

      To apply the Apache License to your work, attach the following
      boilerplate notice, with the fields enclosed by brackets "[]"
      replaced with your own identifying information. (Don't include
      the brackets!)  The text should be enclosed in the appropriate
      comment syntax for the file format. We also recommend that a
      file or class name and description of purpose be included on the
      same "printed page" as the copyright notice for easier
      identification within third-party archives.

   Copyright [yyyy] [name of copyright owner]

   Licensed under the Apache License, Version 2.0 (the "License");
   you may not use this file except in compliance with the License.
   You may obtain a copy of the License at

       http://www.apache.org/licenses/LICENSE-2.0

   Unless required by applicable law or agreed to in writing, software
   distributed under the License is distributed on an "AS IS" BASIS,
   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
   See the License for the specific language governing permissions and
   limitations under the License.
vendor/github.com/Masterminds/goutils/README.md (new file, generated, vendored, 70 additions)
@@ -0,0 +1,70 @@
GoUtils
===========
[](https://masterminds.github.io/stability/maintenance.html)
[](https://godoc.org/github.com/Masterminds/goutils) [](https://travis-ci.org/Masterminds/goutils) [](https://ci.appveyor.com/project/mattfarina/goutils)


GoUtils provides users with utility functions to manipulate strings in various ways. It is a Go implementation of some
string manipulation libraries of Java Apache Commons. GoUtils includes the following Java Apache Commons classes:
* WordUtils
* RandomStringUtils
* StringUtils (partial implementation)

## Installation
If you have Go set up on your system, from the GOPATH directory within the command line/terminal, enter this:

    go get github.com/Masterminds/goutils

If you do not have Go set up on your system, please follow the [Go installation directions from the documenation](http://golang.org/doc/install), and then follow the instructions above to install GoUtils.


## Documentation
GoUtils doc is available here: [](https://godoc.org/github.com/Masterminds/goutils)


## Usage
The code snippets below show examples of how to use GoUtils. Some functions return errors while others do not. The first instance below, which does not return an error, is the `Initials` function (located within the `wordutils.go` file).

    package main

    import (
        "fmt"
        "github.com/Masterminds/goutils"
    )

    func main() {

        // EXAMPLE 1: A goutils function which returns no errors
        fmt.Println (goutils.Initials("John Doe Foo")) // Prints out "JDF"

    }
Some functions return errors mainly due to illegal arguements used as parameters. The code example below illustrates how to deal with function that returns an error. In this instance, the function is the `Random` function (located within the `randomstringutils.go` file).

    package main

    import (
        "fmt"
        "github.com/Masterminds/goutils"
    )

    func main() {

        // EXAMPLE 2: A goutils function which returns an error
        rand1, err1 := goutils.Random (-1, 0, 0, true, true)

        if err1 != nil {
            fmt.Println(err1) // Prints out error message because -1 was entered as the first parameter in goutils.Random(...)
        } else {
            fmt.Println(rand1)
        }

    }

## License
GoUtils is licensed under the Apache License, Version 2.0. Please check the LICENSE.txt file or visit http://www.apache.org/licenses/LICENSE-2.0 for a copy of the license.

## Issue Reporting
Make suggestions or report issues using the Git issue tracker: https://github.com/Masterminds/goutils/issues

## Website
* [GoUtils webpage](http://Masterminds.github.io/goutils/)
vendor/github.com/Masterminds/goutils/appveyor.yml (new file, generated, vendored, 21 additions)
@@ -0,0 +1,21 @@
version: build-{build}.{branch}

clone_folder: C:\gopath\src\github.com\Masterminds\goutils
shallow_clone: true

environment:
  GOPATH: C:\gopath

platform:
  - x64

build: off

install:
  - go version
  - go env

test_script:
  - go test -v

deploy: off
vendor/github.com/Masterminds/goutils/cryptorandomstringutils.go (new file, generated, vendored, 230 additions)
@@ -0,0 +1,230 @@
/*
Copyright 2014 Alexander Okoli

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

     http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
*/

package goutils

import (
	"crypto/rand"
	"fmt"
	"math"
	"math/big"
	"unicode"
)

/*
CryptoRandomNonAlphaNumeric creates a random string whose length is the number of characters specified.
Characters will be chosen from the set of all characters (ASCII/Unicode values between 0 to 2,147,483,647 (math.MaxInt32)).

Parameter:
	count - the length of random string to create

Returns:
	string - the random string
	error - an error stemming from an invalid parameter within underlying function, CryptoRandom(...)
*/
func CryptoRandomNonAlphaNumeric(count int) (string, error) {
	return CryptoRandomAlphaNumericCustom(count, false, false)
}

/*
CryptoRandomAscii creates a random string whose length is the number of characters specified.
Characters will be chosen from the set of characters whose ASCII value is between 32 and 126 (inclusive).

Parameter:
	count - the length of random string to create

Returns:
	string - the random string
	error - an error stemming from an invalid parameter within underlying function, CryptoRandom(...)
*/
func CryptoRandomAscii(count int) (string, error) {
	return CryptoRandom(count, 32, 127, false, false)
}

/*
CryptoRandomNumeric creates a random string whose length is the number of characters specified.
Characters will be chosen from the set of numeric characters.

Parameter:
	count - the length of random string to create

Returns:
	string - the random string
	error - an error stemming from an invalid parameter within underlying function, CryptoRandom(...)
*/
func CryptoRandomNumeric(count int) (string, error) {
	return CryptoRandom(count, 0, 0, false, true)
}

/*
CryptoRandomAlphabetic creates a random string whose length is the number of characters specified.
Characters will be chosen from the set of alpha-numeric characters as indicated by the arguments.

Parameters:
	count - the length of random string to create
	letters - if true, generated string may include alphabetic characters
	numbers - if true, generated string may include numeric characters

Returns:
	string - the random string
	error - an error stemming from an invalid parameter within underlying function, CryptoRandom(...)
*/
func CryptoRandomAlphabetic(count int) (string, error) {
	return CryptoRandom(count, 0, 0, true, false)
}

/*
CryptoRandomAlphaNumeric creates a random string whose length is the number of characters specified.
Characters will be chosen from the set of alpha-numeric characters.

Parameter:
	count - the length of random string to create

Returns:
	string - the random string
	error - an error stemming from an invalid parameter within underlying function, CryptoRandom(...)
*/
func CryptoRandomAlphaNumeric(count int) (string, error) {
	return CryptoRandom(count, 0, 0, true, true)
}

/*
CryptoRandomAlphaNumericCustom creates a random string whose length is the number of characters specified.
Characters will be chosen from the set of alpha-numeric characters as indicated by the arguments.

Parameters:
	count - the length of random string to create
	letters - if true, generated string may include alphabetic characters
	numbers - if true, generated string may include numeric characters

Returns:
	string - the random string
	error - an error stemming from an invalid parameter within underlying function, CryptoRandom(...)
*/
func CryptoRandomAlphaNumericCustom(count int, letters bool, numbers bool) (string, error) {
	return CryptoRandom(count, 0, 0, letters, numbers)
}

/*
CryptoRandom creates a random string based on a variety of options, using using golang's crypto/rand source of randomness.
If the parameters start and end are both 0, start and end are set to ' ' and 'z', the ASCII printable characters, will be used,
unless letters and numbers are both false, in which case, start and end are set to 0 and math.MaxInt32, respectively.
If chars is not nil, characters stored in chars that are between start and end are chosen.

Parameters:
	count - the length of random string to create
	start - the position in set of chars (ASCII/Unicode int) to start at
	end - the position in set of chars (ASCII/Unicode int) to end before
	letters - if true, generated string may include alphabetic characters
	numbers - if true, generated string may include numeric characters
	chars - the set of chars to choose randoms from. If nil, then it will use the set of all chars.

Returns:
	string - the random string
	error - an error stemming from invalid parameters: if count < 0; or the provided chars array is empty; or end <= start; or end > len(chars)
*/
func CryptoRandom(count int, start int, end int, letters bool, numbers bool, chars ...rune) (string, error) {
	if count == 0 {
		return "", nil
	} else if count < 0 {
		err := fmt.Errorf("randomstringutils illegal argument: Requested random string length %v is less than 0.", count) // equiv to err := errors.New("...")
		return "", err
	}
	if chars != nil && len(chars) == 0 {
		err := fmt.Errorf("randomstringutils illegal argument: The chars array must not be empty")
		return "", err
	}

	if start == 0 && end == 0 {
		if chars != nil {
			end = len(chars)
		} else {
			if !letters && !numbers {
				end = math.MaxInt32
			} else {
				end = 'z' + 1
				start = ' '
			}
		}
	} else {
		if end <= start {
			err := fmt.Errorf("randomstringutils illegal argument: Parameter end (%v) must be greater than start (%v)", end, start)
			return "", err
		}

		if chars != nil && end > len(chars) {
			err := fmt.Errorf("randomstringutils illegal argument: Parameter end (%v) cannot be greater than len(chars) (%v)", end, len(chars))
			return "", err
		}
	}

	buffer := make([]rune, count)
	gap := end - start

	// high-surrogates range, (\uD800-\uDBFF) = 55296 - 56319
	// low-surrogates range, (\uDC00-\uDFFF) = 56320 - 57343

	for count != 0 {
		count--
		var ch rune
		if chars == nil {
			ch = rune(getCryptoRandomInt(gap) + int64(start))
		} else {
			ch = chars[getCryptoRandomInt(gap)+int64(start)]
		}

		if letters && unicode.IsLetter(ch) || numbers && unicode.IsDigit(ch) || !letters && !numbers {
			if ch >= 56320 && ch <= 57343 { // low surrogate range
				if count == 0 {
					count++
				} else {
					// Insert low surrogate
					buffer[count] = ch
					count--
					// Insert high surrogate
					buffer[count] = rune(55296 + getCryptoRandomInt(128))
				}
			} else if ch >= 55296 && ch <= 56191 { // High surrogates range (Partial)
				if count == 0 {
					count++
				} else {
					// Insert low surrogate
					buffer[count] = rune(56320 + getCryptoRandomInt(128))
					count--
					// Insert high surrogate
					buffer[count] = ch
				}
			} else if ch >= 56192 && ch <= 56319 {
				// private high surrogate, skip it
				count++
			} else {
				// not one of the surrogates*
				buffer[count] = ch
			}
		} else {
			count++
		}
	}
	return string(buffer), nil
}

func getCryptoRandomInt(count int) int64 {
	nBig, err := rand.Int(rand.Reader, big.NewInt(int64(count)))
	if err != nil {
		panic(err)
	}
	return nBig.Int64()
}
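The doc comments above describe the CryptoRandom* helpers only in terms of their parameters; as a minimal usage sketch (not part of the vendored file, importing the package under the upstream path shown in this diff), they can be called like this:

```go
package main

import (
	"fmt"

	"github.com/Masterminds/goutils"
)

func main() {
	// 16 alphanumeric characters drawn from crypto/rand.
	token, err := goutils.CryptoRandomAlphaNumeric(16)
	if err != nil {
		fmt.Println(err)
		return
	}
	fmt.Println(token)

	// Printable ASCII (values 32-126 inclusive), e.g. for a throwaway secret.
	secret, err := goutils.CryptoRandomAscii(20)
	if err != nil {
		fmt.Println(err)
		return
	}
	fmt.Println(secret)
}
```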
vendor/github.com/Masterminds/goutils/randomstringutils.go (new file, generated, vendored, 248 additions)
@@ -0,0 +1,248 @@
/*
Copyright 2014 Alexander Okoli

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

     http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
*/

package goutils

import (
	"fmt"
	"math"
	"math/rand"
	"time"
	"unicode"
)

// RANDOM provides the time-based seed used to generate random numbers
var RANDOM = rand.New(rand.NewSource(time.Now().UnixNano()))

/*
RandomNonAlphaNumeric creates a random string whose length is the number of characters specified.
Characters will be chosen from the set of all characters (ASCII/Unicode values between 0 to 2,147,483,647 (math.MaxInt32)).

Parameter:
	count - the length of random string to create

Returns:
	string - the random string
	error - an error stemming from an invalid parameter within underlying function, RandomSeed(...)
*/
func RandomNonAlphaNumeric(count int) (string, error) {
	return RandomAlphaNumericCustom(count, false, false)
}

/*
RandomAscii creates a random string whose length is the number of characters specified.
Characters will be chosen from the set of characters whose ASCII value is between 32 and 126 (inclusive).

Parameter:
	count - the length of random string to create

Returns:
	string - the random string
	error - an error stemming from an invalid parameter within underlying function, RandomSeed(...)
*/
func RandomAscii(count int) (string, error) {
	return Random(count, 32, 127, false, false)
}

/*
RandomNumeric creates a random string whose length is the number of characters specified.
Characters will be chosen from the set of numeric characters.

Parameter:
	count - the length of random string to create

Returns:
	string - the random string
	error - an error stemming from an invalid parameter within underlying function, RandomSeed(...)
*/
func RandomNumeric(count int) (string, error) {
	return Random(count, 0, 0, false, true)
}

/*
RandomAlphabetic creates a random string whose length is the number of characters specified.
Characters will be chosen from the set of alphabetic characters.

Parameters:
	count - the length of random string to create

Returns:
	string - the random string
	error - an error stemming from an invalid parameter within underlying function, RandomSeed(...)
*/
func RandomAlphabetic(count int) (string, error) {
	return Random(count, 0, 0, true, false)
}

/*
RandomAlphaNumeric creates a random string whose length is the number of characters specified.
Characters will be chosen from the set of alpha-numeric characters.

Parameter:
	count - the length of random string to create

Returns:
	string - the random string
	error - an error stemming from an invalid parameter within underlying function, RandomSeed(...)
*/
func RandomAlphaNumeric(count int) (string, error) {
	return Random(count, 0, 0, true, true)
}

/*
RandomAlphaNumericCustom creates a random string whose length is the number of characters specified.
Characters will be chosen from the set of alpha-numeric characters as indicated by the arguments.

Parameters:
	count - the length of random string to create
	letters - if true, generated string may include alphabetic characters
	numbers - if true, generated string may include numeric characters

Returns:
	string - the random string
	error - an error stemming from an invalid parameter within underlying function, RandomSeed(...)
*/
func RandomAlphaNumericCustom(count int, letters bool, numbers bool) (string, error) {
	return Random(count, 0, 0, letters, numbers)
}

/*
Random creates a random string based on a variety of options, using default source of randomness.
This method has exactly the same semantics as RandomSeed(int, int, int, bool, bool, []char, *rand.Rand), but
instead of using an externally supplied source of randomness, it uses the internal *rand.Rand instance.

Parameters:
	count - the length of random string to create
	start - the position in set of chars (ASCII/Unicode int) to start at
	end - the position in set of chars (ASCII/Unicode int) to end before
	letters - if true, generated string may include alphabetic characters
	numbers - if true, generated string may include numeric characters
	chars - the set of chars to choose randoms from. If nil, then it will use the set of all chars.

Returns:
	string - the random string
	error - an error stemming from an invalid parameter within underlying function, RandomSeed(...)
*/
func Random(count int, start int, end int, letters bool, numbers bool, chars ...rune) (string, error) {
	return RandomSeed(count, start, end, letters, numbers, chars, RANDOM)
}

/*
RandomSeed creates a random string based on a variety of options, using supplied source of randomness.
If the parameters start and end are both 0, start and end are set to ' ' and 'z', the ASCII printable characters, will be used,
unless letters and numbers are both false, in which case, start and end are set to 0 and math.MaxInt32, respectively.
If chars is not nil, characters stored in chars that are between start and end are chosen.
This method accepts a user-supplied *rand.Rand instance to use as a source of randomness. By seeding a single *rand.Rand instance
with a fixed seed and using it for each call, the same random sequence of strings can be generated repeatedly and predictably.

Parameters:
	count - the length of random string to create
	start - the position in set of chars (ASCII/Unicode decimals) to start at
	end - the position in set of chars (ASCII/Unicode decimals) to end before
	letters - if true, generated string may include alphabetic characters
	numbers - if true, generated string may include numeric characters
	chars - the set of chars to choose randoms from. If nil, then it will use the set of all chars.
	random - a source of randomness.

Returns:
	string - the random string
	error - an error stemming from invalid parameters: if count < 0; or the provided chars array is empty; or end <= start; or end > len(chars)
*/
func RandomSeed(count int, start int, end int, letters bool, numbers bool, chars []rune, random *rand.Rand) (string, error) {

	if count == 0 {
		return "", nil
	} else if count < 0 {
		err := fmt.Errorf("randomstringutils illegal argument: Requested random string length %v is less than 0.", count) // equiv to err := errors.New("...")
		return "", err
	}
	if chars != nil && len(chars) == 0 {
		err := fmt.Errorf("randomstringutils illegal argument: The chars array must not be empty")
		return "", err
	}

	if start == 0 && end == 0 {
		if chars != nil {
			end = len(chars)
		} else {
			if !letters && !numbers {
				end = math.MaxInt32
			} else {
				end = 'z' + 1
				start = ' '
			}
		}
	} else {
		if end <= start {
			err := fmt.Errorf("randomstringutils illegal argument: Parameter end (%v) must be greater than start (%v)", end, start)
			return "", err
		}

		if chars != nil && end > len(chars) {
			err := fmt.Errorf("randomstringutils illegal argument: Parameter end (%v) cannot be greater than len(chars) (%v)", end, len(chars))
			return "", err
		}
	}

	buffer := make([]rune, count)
	gap := end - start

	// high-surrogates range, (\uD800-\uDBFF) = 55296 - 56319
	// low-surrogates range, (\uDC00-\uDFFF) = 56320 - 57343

	for count != 0 {
		count--
		var ch rune
		if chars == nil {
			ch = rune(random.Intn(gap) + start)
		} else {
			ch = chars[random.Intn(gap)+start]
		}

		if letters && unicode.IsLetter(ch) || numbers && unicode.IsDigit(ch) || !letters && !numbers {
			if ch >= 56320 && ch <= 57343 { // low surrogate range
				if count == 0 {
					count++
				} else {
					// Insert low surrogate
					buffer[count] = ch
					count--
					// Insert high surrogate
					buffer[count] = rune(55296 + random.Intn(128))
				}
			} else if ch >= 55296 && ch <= 56191 { // High surrogates range (Partial)
				if count == 0 {
					count++
				} else {
					// Insert low surrogate
					buffer[count] = rune(56320 + random.Intn(128))
					count--
					// Insert high surrogate
					buffer[count] = ch
				}
			} else if ch >= 56192 && ch <= 56319 {
				// private high surrogate, skip it
				count++
			} else {
				// not one of the surrogates*
				buffer[count] = ch
			}
		} else {
			count++
		}
	}
	return string(buffer), nil
}
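The RandomSeed doc comment notes that a single *rand.Rand seeded with a fixed value makes the generated sequence reproducible. A minimal sketch of that (not part of the vendored file; the seed value is arbitrary) looks like:

```go
package main

import (
	"fmt"
	"math/rand"

	"github.com/Masterminds/goutils"
)

func main() {
	// A fixed seed makes the sequence reproducible across runs,
	// which is handy for deterministic tests.
	seeded := rand.New(rand.NewSource(42))

	for i := 0; i < 3; i++ {
		// Same arguments as Random(...), plus an explicit char set (nil = all)
		// and the caller-supplied source of randomness.
		s, err := goutils.RandomSeed(8, 0, 0, true, true, nil, seeded)
		if err != nil {
			fmt.Println(err)
			return
		}
		fmt.Println(s) // same output on every run of this program
	}
}
```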
vendor/github.com/Masterminds/goutils/stringutils.go (new file, generated, vendored, 240 additions)
@@ -0,0 +1,240 @@
/*
Copyright 2014 Alexander Okoli

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

     http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
*/

package goutils

import (
	"bytes"
	"fmt"
	"strings"
	"unicode"
)

// Typically returned by functions where a searched item cannot be found
const INDEX_NOT_FOUND = -1

/*
Abbreviate abbreviates a string using ellipses. This will turn the string "Now is the time for all good men" into "Now is the time for..."

Specifically, the algorithm is as follows:

	- If str is less than maxWidth characters long, return it.
	- Else abbreviate it to (str[0:maxWidth - 3] + "...").
	- If maxWidth is less than 4, return an illegal argument error.
	- In no case will it return a string of length greater than maxWidth.

Parameters:
	str - the string to check
	maxWidth - maximum length of result string, must be at least 4

Returns:
	string - abbreviated string
	error - if the width is too small
*/
func Abbreviate(str string, maxWidth int) (string, error) {
	return AbbreviateFull(str, 0, maxWidth)
}

/*
AbbreviateFull abbreviates a string using ellipses. This will turn the string "Now is the time for all good men" into "...is the time for..."
This function works like Abbreviate(string, int), but allows you to specify a "left edge" offset. Note that this left edge is not
necessarily going to be the leftmost character in the result, or the first character following the ellipses, but it will appear
somewhere in the result.
In no case will it return a string of length greater than maxWidth.

Parameters:
	str - the string to check
	offset - left edge of source string
	maxWidth - maximum length of result string, must be at least 4

Returns:
	string - abbreviated string
	error - if the width is too small
*/
func AbbreviateFull(str string, offset int, maxWidth int) (string, error) {
	if str == "" {
		return "", nil
	}
	if maxWidth < 4 {
		err := fmt.Errorf("stringutils illegal argument: Minimum abbreviation width is 4")
		return "", err
	}
	if len(str) <= maxWidth {
		return str, nil
	}
	if offset > len(str) {
		offset = len(str)
	}
	if len(str)-offset < (maxWidth - 3) { // 15 - 5 < 10 - 3 = 10 < 7
		offset = len(str) - (maxWidth - 3)
	}
	abrevMarker := "..."
	if offset <= 4 {
		return str[0:maxWidth-3] + abrevMarker, nil // str.substring(0, maxWidth - 3) + abrevMarker;
	}
	if maxWidth < 7 {
		err := fmt.Errorf("stringutils illegal argument: Minimum abbreviation width with offset is 7")
		return "", err
	}
	if (offset + maxWidth - 3) < len(str) { // 5 + (10-3) < 15 = 12 < 15
		abrevStr, _ := Abbreviate(str[offset:len(str)], (maxWidth - 3))
		return abrevMarker + abrevStr, nil // abrevMarker + abbreviate(str.substring(offset), maxWidth - 3);
	}
	return abrevMarker + str[(len(str)-(maxWidth-3)):len(str)], nil // abrevMarker + str.substring(str.length() - (maxWidth - 3));
}

/*
DeleteWhiteSpace deletes all whitespaces from a string as defined by unicode.IsSpace(rune).
It returns the string without whitespaces.

Parameter:
	str - the string to delete whitespace from, may be nil

Returns:
	the string without whitespaces
*/
func DeleteWhiteSpace(str string) string {
	if str == "" {
		return str
	}
	sz := len(str)
	var chs bytes.Buffer
	count := 0
	for i := 0; i < sz; i++ {
		ch := rune(str[i])
		if !unicode.IsSpace(ch) {
			chs.WriteRune(ch)
			count++
		}
	}
	if count == sz {
		return str
	}
	return chs.String()
}

/*
IndexOfDifference compares two strings, and returns the index at which the strings begin to differ.

Parameters:
	str1 - the first string
	str2 - the second string

Returns:
	the index where str1 and str2 begin to differ; -1 if they are equal
*/
func IndexOfDifference(str1 string, str2 string) int {
	if str1 == str2 {
		return INDEX_NOT_FOUND
	}
	if IsEmpty(str1) || IsEmpty(str2) {
		return 0
	}
	var i int
	for i = 0; i < len(str1) && i < len(str2); i++ {
		if rune(str1[i]) != rune(str2[i]) {
			break
		}
	}
	if i < len(str2) || i < len(str1) {
		return i
	}
	return INDEX_NOT_FOUND
}

/*
IsBlank checks if a string is whitespace or empty (""). Observe the following behavior:

	goutils.IsBlank("")        = true
	goutils.IsBlank(" ")       = true
	goutils.IsBlank("bob")     = false
	goutils.IsBlank("  bob  ") = false

Parameter:
	str - the string to check

Returns:
	true - if the string is whitespace or empty ("")
*/
func IsBlank(str string) bool {
	strLen := len(str)
	if str == "" || strLen == 0 {
		return true
	}
	for i := 0; i < strLen; i++ {
		if unicode.IsSpace(rune(str[i])) == false {
			return false
		}
	}
	return true
}

/*
IndexOf returns the index of the first instance of sub in str, with the search beginning from the
index start point specified. -1 is returned if sub is not present in str.

An empty string ("") will return -1 (INDEX_NOT_FOUND). A negative start position is treated as zero.
A start position greater than the string length returns -1.

Parameters:
	str - the string to check
	sub - the substring to find
	start - the start position; negative treated as zero

Returns:
	the first index where the sub string was found (always >= start)
*/
func IndexOf(str string, sub string, start int) int {

	if start < 0 {
		start = 0
	}

	if len(str) < start {
		return INDEX_NOT_FOUND
	}

	if IsEmpty(str) || IsEmpty(sub) {
		return INDEX_NOT_FOUND
	}

	partialIndex := strings.Index(str[start:len(str)], sub)
	if partialIndex == -1 {
		return INDEX_NOT_FOUND
	}
	return partialIndex + start
}

// IsEmpty checks if a string is empty (""). Returns true if empty, and false otherwise.
func IsEmpty(str string) bool {
	return len(str) == 0
}

// Returns either the passed in string, or if the string is empty, the value of defaultStr.
func DefaultString(str string, defaultStr string) string {
	if IsEmpty(str) {
		return defaultStr
	}
	return str
}

// Returns either the passed in string, or if the string is whitespace, empty (""), the value of defaultStr.
func DefaultIfBlank(str string, defaultStr string) string {
	if IsBlank(str) {
		return defaultStr
	}
	return str
}
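The Abbreviate doc comment above spells out the algorithm step by step; a minimal usage sketch of it and a few neighbouring helpers (not part of the vendored file, expected outputs derived from the rules in the doc comments) might look like:

```go
package main

import (
	"fmt"

	"github.com/Masterminds/goutils"
)

func main() {
	// Abbreviate returns an error when maxWidth < 4, so check it.
	short, err := goutils.Abbreviate("Now is the time for all good men", 10)
	if err != nil {
		fmt.Println(err)
		return
	}
	fmt.Println(short) // "Now is ..." (never longer than maxWidth)

	fmt.Println(goutils.IsBlank("   "))              // true
	fmt.Println(goutils.IndexOf("abcabc", "b", 2))   // 4
	fmt.Println(goutils.DeleteWhiteSpace(" a b c ")) // "abc"
}
```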
357
vendor/github.com/Masterminds/goutils/wordutils.go
generated
vendored
Normal file
357
vendor/github.com/Masterminds/goutils/wordutils.go
generated
vendored
Normal file
|
|
@ -0,0 +1,357 @@
|
|||
/*
|
||||
Copyright 2014 Alexander Okoli
|
||||
|
||||
Licensed under the Apache License, Version 2.0 (the "License");
|
||||
you may not use this file except in compliance with the License.
|
||||
You may obtain a copy of the License at
|
||||
|
||||
http://www.apache.org/licenses/LICENSE-2.0
|
||||
|
||||
Unless required by applicable law or agreed to in writing, software
|
||||
distributed under the License is distributed on an "AS IS" BASIS,
|
||||
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
See the License for the specific language governing permissions and
|
||||
limitations under the License.
|
||||
*/
|
||||
|
||||
/*
|
||||
Package goutils provides utility functions to manipulate strings in various ways.
|
||||
The code snippets below show examples of how to use goutils. Some functions return
|
||||
errors while others do not, so usage would vary as a result.
|
||||
|
||||
Example:
|
||||
|
||||
package main
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
"github.com/aokoli/goutils"
|
||||
)
|
||||
|
||||
func main() {
|
||||
|
||||
// EXAMPLE 1: A goutils function which returns no errors
|
||||
fmt.Println (goutils.Initials("John Doe Foo")) // Prints out "JDF"
|
||||
|
||||
|
||||
|
||||
// EXAMPLE 2: A goutils function which returns an error
|
||||
rand1, err1 := goutils.Random (-1, 0, 0, true, true)
|
||||
|
||||
if err1 != nil {
|
||||
fmt.Println(err1) // Prints out error message because -1 was entered as the first parameter in goutils.Random(...)
|
||||
} else {
|
||||
fmt.Println(rand1)
|
||||
}
|
||||
}
|
||||
*/
|
||||
package goutils
|
||||
|
||||
import (
|
||||
"bytes"
|
||||
"strings"
|
||||
"unicode"
|
||||
)
|
||||
|
||||
// VERSION indicates the current version of goutils
|
||||
const VERSION = "1.0.0"
|
||||
|
||||
/*
|
||||
Wrap wraps a single line of text, identifying words by ' '.
|
||||
New lines will be separated by '\n'. Very long words, such as URLs will not be wrapped.
|
||||
Leading spaces on a new line are stripped. Trailing spaces are not stripped.
|
||||
|
||||
Parameters:
|
||||
str - the string to be word wrapped
|
||||
wrapLength - the column (a column can fit only one character) to wrap the words at, less than 1 is treated as 1
|
||||
|
||||
Returns:
|
||||
a line with newlines inserted
|
||||
*/
|
||||
func Wrap(str string, wrapLength int) string {
|
||||
return WrapCustom(str, wrapLength, "", false)
|
||||
}
|
||||
|
||||
/*
|
||||
WrapCustom wraps a single line of text, identifying words by ' '.
|
||||
Leading spaces on a new line are stripped. Trailing spaces are not stripped.
|
||||
|
||||
Parameters:
|
||||
str - the string to be word wrapped
|
||||
wrapLength - the column number (a column can fit only one character) to wrap the words at, less than 1 is treated as 1
|
||||
newLineStr - the string to insert for a new line, "" uses '\n'
|
||||
wrapLongWords - true if long words (such as URLs) should be wrapped
|
||||
|
||||
Returns:
|
||||
a line with newlines inserted
|
||||
*/
|
||||
func WrapCustom(str string, wrapLength int, newLineStr string, wrapLongWords bool) string {
|
||||
|
||||
if str == "" {
|
||||
return ""
|
||||
}
|
||||
if newLineStr == "" {
|
||||
newLineStr = "\n" // TODO Assumes "\n" is seperator. Explore SystemUtils.LINE_SEPARATOR from Apache Commons
|
||||
}
|
||||
if wrapLength < 1 {
|
||||
wrapLength = 1
|
||||
}
|
||||
|
||||
inputLineLength := len(str)
|
||||
offset := 0
|
||||
|
||||
var wrappedLine bytes.Buffer
|
||||
|
||||
for inputLineLength-offset > wrapLength {
|
||||
|
||||
if rune(str[offset]) == ' ' {
|
||||
offset++
|
||||
continue
|
||||
}
|
||||
|
||||
end := wrapLength + offset + 1
|
||||
spaceToWrapAt := strings.LastIndex(str[offset:end], " ") + offset
|
||||
|
||||
if spaceToWrapAt >= offset {
|
||||
// normal word (not longer than wrapLength)
|
||||
wrappedLine.WriteString(str[offset:spaceToWrapAt])
|
||||
wrappedLine.WriteString(newLineStr)
|
||||
offset = spaceToWrapAt + 1
|
||||
|
||||
} else {
|
||||
// long word or URL
|
||||
if wrapLongWords {
|
||||
end := wrapLength + offset
|
||||
// long words are wrapped one line at a time
|
||||
wrappedLine.WriteString(str[offset:end])
|
||||
wrappedLine.WriteString(newLineStr)
|
||||
offset += wrapLength
|
||||
} else {
|
||||
// long words aren't wrapped, just extended beyond limit
|
||||
end := wrapLength + offset
|
||||
index := strings.IndexRune(str[end:len(str)], ' ')
|
||||
if index == -1 {
|
||||
wrappedLine.WriteString(str[offset:len(str)])
|
||||
offset = inputLineLength
|
||||
} else {
|
||||
spaceToWrapAt = index + end
|
||||
wrappedLine.WriteString(str[offset:spaceToWrapAt])
|
||||
wrappedLine.WriteString(newLineStr)
|
||||
offset = spaceToWrapAt + 1
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
wrappedLine.WriteString(str[offset:len(str)])
|
||||
|
||||
return wrappedLine.String()
|
||||
|
||||
}
|
||||
|
||||
/*
|
||||
Capitalize capitalizes all the delimiter separated words in a string. Only the first letter of each word is changed.
|
||||
To convert the rest of each word to lowercase at the same time, use CapitalizeFully(str string, delimiters ...rune).
|
||||
The delimiters represent a set of characters understood to separate words. The first string character
|
||||
and the first non-delimiter character after a delimiter will be capitalized. A "" input string returns "".
|
||||
Capitalization uses the Unicode title case, normally equivalent to upper case.
|
||||
|
||||
Parameters:
|
||||
str - the string to capitalize
|
||||
delimiters - set of characters to determine capitalization, exclusion of this parameter means whitespace would be delimeter
|
||||
|
||||
Returns:
|
||||
capitalized string
|
||||
*/
|
||||
func Capitalize(str string, delimiters ...rune) string {
|
||||
|
||||
var delimLen int
|
||||
|
||||
if delimiters == nil {
|
||||
delimLen = -1
|
||||
} else {
|
||||
delimLen = len(delimiters)
|
||||
}
|
||||
|
||||
if str == "" || delimLen == 0 {
|
||||
return str
|
||||
}
|
||||
|
||||
buffer := []rune(str)
|
||||
capitalizeNext := true
|
||||
for i := 0; i < len(buffer); i++ {
|
||||
ch := buffer[i]
|
||||
if isDelimiter(ch, delimiters...) {
|
||||
capitalizeNext = true
|
||||
} else if capitalizeNext {
|
||||
buffer[i] = unicode.ToTitle(ch)
|
||||
capitalizeNext = false
|
||||
}
|
||||
}
|
||||
return string(buffer)
|
||||
|
||||
}
|
||||
|
||||
/*
|
||||
CapitalizeFully converts all the delimiter separated words in a string into capitalized words, that is each word is made up of a
|
||||
titlecase character and then a series of lowercase characters. The delimiters represent a set of characters understood
|
||||
to separate words. The first string character and the first non-delimiter character after a delimiter will be capitalized.
|
||||
Capitalization uses the Unicode title case, normally equivalent to upper case.
|
||||
|
||||
Parameters:
|
||||
str - the string to capitalize fully
|
||||
delimiters - set of characters to determine capitalization, exclusion of this parameter means whitespace would be delimeter
|
||||
|
||||
Returns:
|
||||
capitalized string
|
||||
*/
|
||||
func CapitalizeFully(str string, delimiters ...rune) string {
|
||||
|
||||
var delimLen int
|
||||
|
||||
if delimiters == nil {
|
||||
delimLen = -1
|
||||
} else {
|
||||
delimLen = len(delimiters)
|
||||
}
|
||||
|
||||
if str == "" || delimLen == 0 {
|
||||
return str
|
||||
}
|
||||
str = strings.ToLower(str)
|
||||
return Capitalize(str, delimiters...)
|
||||
}
|
||||
|
||||
/*
|
||||
Uncapitalize uncapitalizes all the whitespace separated words in a string. Only the first letter of each word is changed.
|
||||
The delimiters represent a set of characters understood to separate words. The first string character and the first non-delimiter
|
||||
character after a delimiter will be uncapitalized. Whitespace is defined by unicode.IsSpace(char).
|
||||
|
||||
Parameters:
|
||||
str - the string to uncapitalize fully
|
||||
delimiters - set of characters to determine capitalization, exclusion of this parameter means whitespace would be delimeter
|
||||
|
||||
Returns:
|
||||
uncapitalized string
|
||||
*/
|
||||
func Uncapitalize(str string, delimiters ...rune) string {
|
||||
|
||||
var delimLen int
|
||||
|
||||
if delimiters == nil {
|
||||
delimLen = -1
|
||||
} else {
|
||||
delimLen = len(delimiters)
|
||||
}
|
||||
|
||||
if str == "" || delimLen == 0 {
|
||||
return str
|
||||
}
|
||||
|
||||
buffer := []rune(str)
|
||||
uncapitalizeNext := true // TODO Always makes capitalize/un apply to first char.
|
||||
for i := 0; i < len(buffer); i++ {
|
||||
ch := buffer[i]
|
||||
if isDelimiter(ch, delimiters...) {
|
||||
uncapitalizeNext = true
|
||||
} else if uncapitalizeNext {
|
||||
buffer[i] = unicode.ToLower(ch)
|
||||
uncapitalizeNext = false
|
||||
}
|
||||
}
|
||||
return string(buffer)
|
||||
}
|
||||
|
||||
/*
|
||||
SwapCase swaps the case of a string using a word based algorithm.
|
||||
|
||||
Conversion algorithm:
|
||||
|
||||
Upper case character converts to Lower case
|
||||
Title case character converts to Lower case
|
||||
Lower case character after Whitespace or at start converts to Title case
|
||||
Other Lower case character converts to Upper case
|
||||
Whitespace is defined by unicode.IsSpace(char).
|
||||
|
||||
Parameters:
|
||||
str - the string to swap case
|
||||
|
||||
Returns:
|
||||
the changed string
|
||||
*/
|
||||
func SwapCase(str string) string {
|
||||
if str == "" {
|
||||
return str
|
||||
}
|
||||
buffer := []rune(str)
|
||||
|
||||
whitespace := true
|
||||
|
||||
for i := 0; i < len(buffer); i++ {
|
||||
ch := buffer[i]
|
||||
if unicode.IsUpper(ch) {
|
||||
buffer[i] = unicode.ToLower(ch)
|
||||
whitespace = false
|
||||
} else if unicode.IsTitle(ch) {
|
||||
buffer[i] = unicode.ToLower(ch)
|
||||
whitespace = false
|
||||
} else if unicode.IsLower(ch) {
|
||||
if whitespace {
|
||||
buffer[i] = unicode.ToTitle(ch)
|
||||
whitespace = false
|
||||
} else {
|
||||
buffer[i] = unicode.ToUpper(ch)
|
||||
}
|
||||
} else {
|
||||
whitespace = unicode.IsSpace(ch)
|
||||
}
|
||||
}
|
||||
return string(buffer)
|
||||
}
|
||||
|
||||
/*
|
||||
Initials extracts the initial letters from each word in the string. The first letter of the string and all first
|
||||
letters after the defined delimiters are returned as a new string. Their case is not changed. If the delimiters
|
||||
parameter is excluded, then Whitespace is used. Whitespace is defined by unicode.IsSpacea(char). An empty delimiter array returns an empty string.
|
||||
|
||||
Parameters:
|
||||
str - the string to get initials from
|
||||
delimiters - set of characters to determine words, exclusion of this parameter means whitespace would be delimeter
|
||||
Returns:
|
||||
string of initial letters
|
||||
*/
|
||||
func Initials(str string, delimiters ...rune) string {
|
||||
if str == "" {
|
||||
return str
|
||||
}
|
||||
if delimiters != nil && len(delimiters) == 0 {
|
||||
return ""
|
||||
}
|
||||
strLen := len(str)
|
||||
var buf bytes.Buffer
|
||||
lastWasGap := true
|
||||
for i := 0; i < strLen; i++ {
|
||||
ch := rune(str[i])
|
||||
|
||||
if isDelimiter(ch, delimiters...) {
|
||||
lastWasGap = true
|
||||
} else if lastWasGap {
|
||||
buf.WriteRune(ch)
|
||||
lastWasGap = false
|
||||
}
|
||||
}
|
||||
return buf.String()
|
||||
}
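A small sketch with hypothetical inputs, showing the default whitespace behavior and an explicit delimiter set:

```go
package main

import (
	"fmt"

	"github.com/Masterminds/goutils"
)

func main() {
	// Default: words are split on whitespace.
	fmt.Println(goutils.Initials("John Fitzgerald Kennedy")) // "JFK"

	// Explicit delimiters: space and hyphen both start a new word.
	fmt.Println(goutils.Initials("i am here 123", ' ', '-')) // "iah1"
}
```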
|
||||
|
||||
// private function (lower case func name)
|
||||
func isDelimiter(ch rune, delimiters ...rune) bool {
|
||||
if delimiters == nil {
|
||||
return unicode.IsSpace(ch)
|
||||
}
|
||||
for _, delimiter := range delimiters {
|
||||
if ch == delimiter {
|
||||
return true
|
||||
}
|
||||
}
|
||||
return false
|
||||
}
|
||||
1
vendor/github.com/Masterminds/semver/v3/.gitignore
generated
vendored
Normal file
|
|
@ -0,0 +1 @@
|
|||
_fuzz/
|
||||
30
vendor/github.com/Masterminds/semver/v3/.golangci.yml
generated
vendored
Normal file
|
|
@ -0,0 +1,30 @@
|
|||
run:
|
||||
deadline: 2m
|
||||
|
||||
linters:
|
||||
disable-all: true
|
||||
enable:
|
||||
- misspell
|
||||
- structcheck
|
||||
- govet
|
||||
- staticcheck
|
||||
- deadcode
|
||||
- errcheck
|
||||
- varcheck
|
||||
- unparam
|
||||
- ineffassign
|
||||
- nakedret
|
||||
- gocyclo
|
||||
- dupl
|
||||
- goimports
|
||||
- revive
|
||||
- gosec
|
||||
- gosimple
|
||||
- typecheck
|
||||
- unused
|
||||
|
||||
linters-settings:
|
||||
gofmt:
|
||||
simplify: true
|
||||
dupl:
|
||||
threshold: 600
|
||||
214
vendor/github.com/Masterminds/semver/v3/CHANGELOG.md
generated
vendored
Normal file
|
|
@ -0,0 +1,214 @@
|
|||
# Changelog
|
||||
|
||||
## 3.2.0 (2022-11-28)
|
||||
|
||||
### Added
|
||||
|
||||
- #190: Added text marshaling and unmarshaling
|
||||
- #167: Added JSON marshalling for constraints (thanks @SimonTheLeg)
|
||||
- #173: Implement encoding.TextMarshaler and encoding.TextUnmarshaler on Version (thanks @MarkRosemaker)
|
||||
- #179: Added New() version constructor (thanks @kazhuravlev)
|
||||
|
||||
### Changed
|
||||
|
||||
- #182/#183: Updated CI testing setup
|
||||
|
||||
### Fixed
|
||||
|
||||
- #186: Fixing issue where validation of constraint section gave false positives
|
||||
- #176: Fix constraints check with *-0 (thanks @mtt0)
|
||||
- #181: Fixed Caret operator (^) gives unexpected results when the minor version in constraint is 0 (thanks @arshchimni)
|
||||
- #161: Fixed godoc (thanks @afirth)
|
||||
|
||||
## 3.1.1 (2020-11-23)
|
||||
|
||||
### Fixed
|
||||
|
||||
- #158: Fixed issue with generated regex operation order that could cause problem
|
||||
|
||||
## 3.1.0 (2020-04-15)
|
||||
|
||||
### Added
|
||||
|
||||
- #131: Add support for serializing/deserializing SQL (thanks @ryancurrah)
|
||||
|
||||
### Changed
|
||||
|
||||
- #148: More accurate validation messages on constraints
|
||||
|
||||
## 3.0.3 (2019-12-13)
|
||||
|
||||
### Fixed
|
||||
|
||||
- #141: Fixed issue with <= comparison
|
||||
|
||||
## 3.0.2 (2019-11-14)
|
||||
|
||||
### Fixed
|
||||
|
||||
- #134: Fixed broken constraint checking with ^0.0 (thanks @krmichelos)
|
||||
|
||||
## 3.0.1 (2019-09-13)
|
||||
|
||||
### Fixed
|
||||
|
||||
- #125: Fixes issue with module path for v3
|
||||
|
||||
## 3.0.0 (2019-09-12)
|
||||
|
||||
This is a major release of the semver package which includes API changes. The Go
|
||||
API is compatible with ^1. The Go API was not changed because many people are using
|
||||
`go get` without Go modules for their applications and API breaking changes cause
|
||||
errors which we have or would need to support.
|
||||
|
||||
The changes in this release are the handling based on the data passed into the
|
||||
functions. These are described in the added and changed sections below.
|
||||
|
||||
### Added
|
||||
|
||||
- StrictNewVersion function. This is similar to NewVersion but will return an
|
||||
error if the version passed in is not a strict semantic version. For example,
|
||||
1.2.3 would pass but v1.2.3 or 1.2 would fail because they are not strictly
|
||||
speaking semantic versions. This function is faster, performs fewer operations,
|
||||
and uses fewer allocations than NewVersion.
|
||||
- Fuzzing has been performed on NewVersion, StrictNewVersion, and NewConstraint.
|
||||
The Makefile contains the operations used. For more information you can start
|
||||
on Wikipedia at https://en.wikipedia.org/wiki/Fuzzing
|
||||
- Now using Go modules
|
||||
|
||||
### Changed
|
||||
|
||||
- NewVersion has proper prerelease and metadata validation with error messages
|
||||
to signal an issue with either of them
|
||||
- ^ now operates using a similar set of rules to npm/js and Rust/Cargo. If the
|
||||
version is >=1 the ^ ranges works the same as v1. For major versions of 0 the
|
||||
rules have changed. The minor version is treated as the stable version unless
|
||||
a patch is specified and then it is equivalent to =. One difference from npm/js
|
||||
is that prereleases there are only to a specific version (e.g. 1.2.3).
|
||||
Prereleases here look over multiple versions and follow semantic version
|
||||
ordering rules. This pattern now follows along with the expected and requested
|
||||
handling of this package by numerous users.
|
||||
|
||||
## 1.5.0 (2019-09-11)
|
||||
|
||||
### Added
|
||||
|
||||
- #103: Add basic fuzzing for `NewVersion()` (thanks @jesse-c)
|
||||
|
||||
### Changed
|
||||
|
||||
- #82: Clarify wildcard meaning in range constraints and update tests for it (thanks @greysteil)
|
||||
- #83: Clarify caret operator range for pre-1.0.0 dependencies (thanks @greysteil)
|
||||
- #72: Adding docs comment pointing to vert for a cli
|
||||
- #71: Update the docs on pre-release comparator handling
|
||||
- #89: Test with new go versions (thanks @thedevsaddam)
|
||||
- #87: Added $ to ValidPrerelease for better validation (thanks @jeremycarroll)
|
||||
|
||||
### Fixed
|
||||
|
||||
- #78: Fix unchecked error in example code (thanks @ravron)
|
||||
- #70: Fix the handling of pre-releases and the 0.0.0 release edge case
|
||||
- #97: Fixed copyright file for proper display on GitHub
|
||||
- #107: Fix handling prerelease when sorting alphanum and num
|
||||
- #109: Fixed where Validate sometimes returns wrong message on error
|
||||
|
||||
## 1.4.2 (2018-04-10)
|
||||
|
||||
### Changed
|
||||
|
||||
- #72: Updated the docs to point to vert for a console application
|
||||
- #71: Update the docs on pre-release comparator handling
|
||||
|
||||
### Fixed
|
||||
|
||||
- #70: Fix the handling of pre-releases and the 0.0.0 release edge case
|
||||
|
||||
## 1.4.1 (2018-04-02)
|
||||
|
||||
### Fixed
|
||||
|
||||
- Fixed #64: Fix pre-release precedence issue (thanks @uudashr)
|
||||
|
||||
## 1.4.0 (2017-10-04)
|
||||
|
||||
### Changed
|
||||
|
||||
- #61: Update NewVersion to parse ints with a 64bit int size (thanks @zknill)
|
||||
|
||||
## 1.3.1 (2017-07-10)
|
||||
|
||||
### Fixed
|
||||
|
||||
- Fixed #57: number comparisons in prerelease sometimes inaccurate
|
||||
|
||||
## 1.3.0 (2017-05-02)
|
||||
|
||||
### Added
|
||||
|
||||
- #45: Added json (un)marshaling support (thanks @mh-cbon)
|
||||
- Stability marker. See https://masterminds.github.io/stability/
|
||||
|
||||
### Fixed
|
||||
|
||||
- #51: Fix handling of single digit tilde constraint (thanks @dgodd)
|
||||
|
||||
### Changed
|
||||
|
||||
- #55: The godoc icon moved from png to svg
|
||||
|
||||
## 1.2.3 (2017-04-03)
|
||||
|
||||
### Fixed
|
||||
|
||||
- #46: Fixed 0.x.x and 0.0.x in constraints being treated as *
|
||||
|
||||
## Release 1.2.2 (2016-12-13)
|
||||
|
||||
### Fixed
|
||||
|
||||
- #34: Fixed issue where hyphen range was not working with pre-release parsing.
|
||||
|
||||
## Release 1.2.1 (2016-11-28)
|
||||
|
||||
### Fixed
|
||||
|
||||
- #24: Fixed edge case issue where constraint "> 0" does not handle "0.0.1-alpha"
|
||||
properly.
|
||||
|
||||
## Release 1.2.0 (2016-11-04)
|
||||
|
||||
### Added
|
||||
|
||||
- #20: Added MustParse function for versions (thanks @adamreese)
|
||||
- #15: Added increment methods on versions (thanks @mh-cbon)
|
||||
|
||||
### Fixed
|
||||
|
||||
- Issue #21: Per the SemVer spec (section 9) a pre-release is unstable and
|
||||
might not satisfy the intended compatibility. The change here ignores pre-releases
|
||||
on constraint checks (e.g., ~ or ^) when a pre-release is not part of the
|
||||
constraint. For example, `^1.2.3` will ignore pre-releases while
|
||||
`^1.2.3-alpha` will include them.
|
||||
|
||||
## Release 1.1.1 (2016-06-30)
|
||||
|
||||
### Changed
|
||||
|
||||
- Issue #9: Speed up version comparison performance (thanks @sdboyer)
|
||||
- Issue #8: Added benchmarks (thanks @sdboyer)
|
||||
- Updated Go Report Card URL to new location
|
||||
- Updated Readme to add code snippet formatting (thanks @mh-cbon)
|
||||
- Updating tagging to v[SemVer] structure for compatibility with other tools.
|
||||
|
||||
## Release 1.1.0 (2016-03-11)
|
||||
|
||||
- Issue #2: Implemented validation to provide reasons a version failed a
|
||||
constraint.
|
||||
|
||||
## Release 1.0.1 (2015-12-31)
|
||||
|
||||
- Fixed #1: * constraint failing on valid versions.
|
||||
|
||||
## Release 1.0.0 (2015-10-20)
|
||||
|
||||
- Initial release
|
||||
19
vendor/github.com/Masterminds/semver/v3/LICENSE.txt
generated
vendored
Normal file
|
|
@ -0,0 +1,19 @@
|
|||
Copyright (C) 2014-2019, Matt Butcher and Matt Farina
|
||||
|
||||
Permission is hereby granted, free of charge, to any person obtaining a copy
|
||||
of this software and associated documentation files (the "Software"), to deal
|
||||
in the Software without restriction, including without limitation the rights
|
||||
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
|
||||
copies of the Software, and to permit persons to whom the Software is
|
||||
furnished to do so, subject to the following conditions:
|
||||
|
||||
The above copyright notice and this permission notice shall be included in
|
||||
all copies or substantial portions of the Software.
|
||||
|
||||
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
|
||||
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
|
||||
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
|
||||
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
|
||||
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
|
||||
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
|
||||
THE SOFTWARE.
|
||||
37
vendor/github.com/Masterminds/semver/v3/Makefile
generated
vendored
Normal file
|
|
@ -0,0 +1,37 @@
|
|||
GOPATH=$(shell go env GOPATH)
|
||||
GOLANGCI_LINT=$(GOPATH)/bin/golangci-lint
|
||||
GOFUZZBUILD = $(GOPATH)/bin/go-fuzz-build
|
||||
GOFUZZ = $(GOPATH)/bin/go-fuzz
|
||||
|
||||
.PHONY: lint
|
||||
lint: $(GOLANGCI_LINT)
|
||||
@echo "==> Linting codebase"
|
||||
@$(GOLANGCI_LINT) run
|
||||
|
||||
.PHONY: test
|
||||
test:
|
||||
@echo "==> Running tests"
|
||||
GO111MODULE=on go test -v
|
||||
|
||||
.PHONY: test-cover
|
||||
test-cover:
|
||||
@echo "==> Running Tests with coverage"
|
||||
GO111MODULE=on go test -cover .
|
||||
|
||||
.PHONY: fuzz
|
||||
fuzz: $(GOFUZZBUILD) $(GOFUZZ)
|
||||
@echo "==> Fuzz testing"
|
||||
$(GOFUZZBUILD)
|
||||
$(GOFUZZ) -workdir=_fuzz
|
||||
|
||||
$(GOLANGCI_LINT):
|
||||
# Install golangci-lint. The configuration for it is in the .golangci.yml
|
||||
# file in the root of the repository
|
||||
echo ${GOPATH}
|
||||
curl -sfL https://install.goreleaser.com/github.com/golangci/golangci-lint.sh | sh -s -- -b $(GOPATH)/bin v1.17.1
|
||||
|
||||
$(GOFUZZBUILD):
|
||||
cd / && go get -u github.com/dvyukov/go-fuzz/go-fuzz-build
|
||||
|
||||
$(GOFUZZ):
|
||||
cd / && go get -u github.com/dvyukov/go-fuzz/go-fuzz github.com/dvyukov/go-fuzz/go-fuzz-dep
|
||||
244
vendor/github.com/Masterminds/semver/v3/README.md
generated
vendored
Normal file
|
|
@ -0,0 +1,244 @@
|
|||
# SemVer
|
||||
|
||||
The `semver` package provides the ability to work with [Semantic Versions](http://semver.org) in Go. Specifically it provides the ability to:
|
||||
|
||||
* Parse semantic versions
|
||||
* Sort semantic versions
|
||||
* Check if a semantic version fits within a set of constraints
|
||||
* Optionally work with a `v` prefix
|
||||
|
||||
[](https://masterminds.github.io/stability/active.html)
|
||||
[](https://github.com/Masterminds/semver/actions)
|
||||
[](https://pkg.go.dev/github.com/Masterminds/semver/v3)
|
||||
[](https://goreportcard.com/report/github.com/Masterminds/semver)
|
||||
|
||||
If you are looking for a command line tool for version comparisons please see
|
||||
[vert](https://github.com/Masterminds/vert) which uses this library.
|
||||
|
||||
## Package Versions
|
||||
|
||||
There are three major versions of the `semver` package.
|
||||
|
||||
* 3.x.x is the new stable and active version. This version is focused on constraint
|
||||
compatibility for range handling in other tools from other languages. It has
|
||||
a similar API to the v1 releases. The development of this version is on the master
|
||||
branch. The documentation for this version is below.
|
||||
* 2.x was developed primarily for [dep](https://github.com/golang/dep). There are
|
||||
no tagged releases and the development was performed by [@sdboyer](https://github.com/sdboyer).
|
||||
There are API breaking changes from v1. This version lives on the [2.x branch](https://github.com/Masterminds/semver/tree/2.x).
|
||||
* 1.x.x is the most widely used version with numerous tagged releases. This is the
|
||||
previous stable and is still maintained for bug fixes. The development, to fix
|
||||
bugs, occurs on the release-1 branch. You can read the documentation [here](https://github.com/Masterminds/semver/blob/release-1/README.md).
|
||||
|
||||
## Parsing Semantic Versions
|
||||
|
||||
There are two functions that can parse semantic versions. The `StrictNewVersion`
|
||||
function only parses valid version 2 semantic versions as outlined in the
|
||||
specification. The `NewVersion` function attempts to coerce a version into a
|
||||
semantic version and parse it. For example, if there is a leading v or a version
|
||||
listed without all 3 parts (e.g. `v1.2`) it will attempt to coerce it into a valid
|
||||
semantic version (e.g., 1.2.0). In both cases a `Version` object is returned
|
||||
that can be sorted, compared, and used in constraints.
|
||||
|
||||
When parsing a version an error is returned if there is an issue parsing the
|
||||
version. For example,
|
||||
|
||||
v, err := semver.NewVersion("1.2.3-beta.1+build345")
|
||||
|
||||
The version object has methods to get the parts of the version, compare it to
|
||||
other versions, convert the version back into a string, and get the original
|
||||
string. Getting the original string is useful if the semantic version was coerced
|
||||
into a valid form.
|
||||
|
||||
## Sorting Semantic Versions
|
||||
|
||||
A set of versions can be sorted using the `sort` package from the standard library.
|
||||
For example,
|
||||
|
||||
```go
|
||||
raw := []string{"1.2.3", "1.0", "1.3", "2", "0.4.2",}
|
||||
vs := make([]*semver.Version, len(raw))
|
||||
for i, r := range raw {
|
||||
v, err := semver.NewVersion(r)
|
||||
if err != nil {
|
||||
t.Errorf("Error parsing version: %s", err)
|
||||
}
|
||||
|
||||
vs[i] = v
|
||||
}
|
||||
|
||||
sort.Sort(semver.Collection(vs))
|
||||
```
|
||||
|
||||
## Checking Version Constraints
|
||||
|
||||
There are two methods for comparing versions. One uses comparison methods on
|
||||
`Version` instances and the other uses `Constraints`. There are some important
|
||||
differences to note between these two methods of comparison.
|
||||
|
||||
1. When two versions are compared using functions such as `Compare`, `LessThan`,
|
||||
and others it will follow the specification and always include prereleases
|
||||
within the comparison. It will provide an answer that is valid with the
|
||||
comparison section of the spec at https://semver.org/#spec-item-11
|
||||
2. When constraint checking is used for checks or validation it will follow a
|
||||
different set of rules that are common for ranges with tools like npm/js
|
||||
and Rust/Cargo. This includes considering prereleases to be invalid if the
|
||||
range does not include one. If you want to have it include pre-releases a
|
||||
simple solution is to include `-0` in your range.
|
||||
3. Constraint ranges can have some complex rules including the shorthand use of
|
||||
~ and ^. For more details on those see the options below.
|
||||
|
||||
There are differences between the two methods of checking versions because the
|
||||
comparison methods on `Version` follow the specification while comparison ranges
|
||||
are not part of the specification. Different packages and tools have taken it
|
||||
upon themselves to come up with range rules. This has resulted in differences.
|
||||
For example, npm/js and Cargo/Rust follow similar patterns while PHP has a
|
||||
different pattern for ^. The comparison features in this package follow the
|
||||
npm/js and Cargo/Rust lead because applications using it have followed similar
|
||||
patterns with their versions.
|
||||
|
||||
Checking a version against version constraints is one of the most featureful
|
||||
parts of the package.
|
||||
|
||||
```go
|
||||
c, err := semver.NewConstraint(">= 1.2.3")
|
||||
if err != nil {
|
||||
// Handle constraint not being parsable.
|
||||
}
|
||||
|
||||
v, err := semver.NewVersion("1.3")
|
||||
if err != nil {
|
||||
// Handle version not being parsable.
|
||||
}
|
||||
// Check if the version meets the constraints. The a variable will be true.
|
||||
a := c.Check(v)
|
||||
```
|
||||
|
||||
### Basic Comparisons
|
||||
|
||||
There are two elements to the comparisons. First, a comparison string is a list
|
||||
of space or comma separated AND comparisons. These are then separated by || (OR)
|
||||
comparisons. For example, `">= 1.2 < 3.0.0 || >= 4.2.3"` is looking for a
|
||||
comparison that's greater than or equal to 1.2 and less than 3.0.0 or is
|
||||
greater than or equal to 4.2.3.
|
||||
|
||||
The basic comparisons are:
|
||||
|
||||
* `=`: equal (aliased to no operator)
|
||||
* `!=`: not equal
|
||||
* `>`: greater than
|
||||
* `<`: less than
|
||||
* `>=`: greater than or equal to
|
||||
* `<=`: less than or equal to
|
||||
|
||||
### Working With Prerelease Versions
|
||||
|
||||
Pre-releases, for those not familiar with them, are used for software releases
|
||||
prior to stable or generally available releases. Examples of prereleases include
|
||||
development, alpha, beta, and release candidate releases. A prerelease may be
|
||||
a version such as `1.2.3-beta.1` while the stable release would be `1.2.3`. In the
|
||||
order of precedence, prereleases come before their associated releases. In this
|
||||
example `1.2.3-beta.1 < 1.2.3`.
|
||||
|
||||
According to the Semantic Version specification prereleases may not be
|
||||
API compliant with their release counterpart. It says,
|
||||
|
||||
> A pre-release version indicates that the version is unstable and might not satisfy the intended compatibility requirements as denoted by its associated normal version.
|
||||
|
||||
SemVer comparisons using constraints without a prerelease comparator will skip
|
||||
prerelease versions. For example, `>=1.2.3` will skip prereleases when looking
|
||||
at a list of releases while `>=1.2.3-0` will evaluate and find prereleases.
|
||||
|
||||
The reason for the `0` as a pre-release version in the example comparison is
|
||||
because pre-releases can only contain ASCII alphanumerics and hyphens (along with
|
||||
`.` separators), per the spec. Sorting happens in ASCII sort order, again per the
|
||||
spec. The lowest character is a `0` in ASCII sort order
|
||||
(see an [ASCII Table](http://www.asciitable.com/))
|
||||
|
||||
Understanding ASCII sort ordering is important because A-Z comes before a-z. That
|
||||
means `>=1.2.3-BETA` will return `1.2.3-alpha`. What you might expect from case
|
||||
sensitivity doesn't apply here. This is due to ASCII sort ordering which is what
|
||||
the spec specifies.
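A small sketch of the effect of the `-0` comparator described above (hypothetical versions, errors elided):

```go
c1, _ := semver.NewConstraint(">=1.2.3")
c2, _ := semver.NewConstraint(">=1.2.3-0")

v, _ := semver.NewVersion("1.2.4-beta.1")

fmt.Println(c1.Check(v)) // false: prereleases are skipped without a prerelease comparator
fmt.Println(c2.Check(v)) // true: the -0 comparator opts the range in to prereleases
```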
|
||||
|
||||
### Hyphen Range Comparisons
|
||||
|
||||
There are multiple methods to handle ranges and the first is hyphen ranges.
|
||||
These look like:
|
||||
|
||||
* `1.2 - 1.4.5` which is equivalent to `>= 1.2 <= 1.4.5`
|
||||
* `2.3.4 - 4.5` which is equivalent to `>= 2.3.4 <= 4.5`
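For example, checking a version against the first range above might look like this (a hedged sketch, errors elided):

```go
c, _ := semver.NewConstraint("1.2 - 1.4.5") // rewritten internally to ">= 1.2, <= 1.4.5"

v, _ := semver.NewVersion("1.3.7")
fmt.Println(c.Check(v)) // true
```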
|
||||
|
||||
### Wildcards In Comparisons
|
||||
|
||||
The `x`, `X`, and `*` characters can be used as a wildcard character. This works
|
||||
for all comparison operators. When used on the `=` operator it falls
|
||||
back to the patch level comparison (see tilde below). For example,
|
||||
|
||||
* `1.2.x` is equivalent to `>= 1.2.0, < 1.3.0`
|
||||
* `>= 1.2.x` is equivalent to `>= 1.2.0`
|
||||
* `<= 2.x` is equivalent to `< 3`
|
||||
* `*` is equivalent to `>= 0.0.0`
|
||||
|
||||
### Tilde Range Comparisons (Patch)
|
||||
|
||||
The tilde (`~`) comparison operator is for patch level ranges when a minor
|
||||
version is specified and major level changes when the minor number is missing.
|
||||
For example,
|
||||
|
||||
* `~1.2.3` is equivalent to `>= 1.2.3, < 1.3.0`
|
||||
* `~1` is equivalent to `>= 1, < 2`
|
||||
* `~2.3` is equivalent to `>= 2.3, < 2.4`
|
||||
* `~1.2.x` is equivalent to `>= 1.2.0, < 1.3.0`
|
||||
* `~1.x` is equivalent to `>= 1, < 2`
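A minimal sketch of a patch-level range (hypothetical versions, errors elided):

```go
c, _ := semver.NewConstraint("~1.2.3") // >= 1.2.3, < 1.3.0

v1, _ := semver.NewVersion("1.2.9")
v2, _ := semver.NewVersion("1.3.0")

fmt.Println(c.Check(v1)) // true
fmt.Println(c.Check(v2)) // false
```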
|
||||
|
||||
### Caret Range Comparisons (Major)
|
||||
|
||||
The caret (`^`) comparison operator is for major level changes once a stable
|
||||
(1.0.0) release has occurred. Prior to a 1.0.0 release the minor version acts
|
||||
as the API stability level. This is useful when comparing API versions, as a
|
||||
major change is API breaking. For example,
|
||||
|
||||
* `^1.2.3` is equivalent to `>= 1.2.3, < 2.0.0`
|
||||
* `^1.2.x` is equivalent to `>= 1.2.0, < 2.0.0`
|
||||
* `^2.3` is equivalent to `>= 2.3, < 3`
|
||||
* `^2.x` is equivalent to `>= 2.0.0, < 3`
|
||||
* `^0.2.3` is equivalent to `>=0.2.3 <0.3.0`
|
||||
* `^0.2` is equivalent to `>=0.2.0 <0.3.0`
|
||||
* `^0.0.3` is equivalent to `>=0.0.3 <0.0.4`
|
||||
* `^0.0` is equivalent to `>=0.0.0 <0.1.0`
|
||||
* `^0` is equivalent to `>=0.0.0 <1.0.0`
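And a matching sketch for the pre-1.0.0 caret behavior (hypothetical versions, errors elided):

```go
c, _ := semver.NewConstraint("^0.2.3") // >= 0.2.3, < 0.3.0

v1, _ := semver.NewVersion("0.2.9")
v2, _ := semver.NewVersion("0.3.0")

fmt.Println(c.Check(v1)) // true
fmt.Println(c.Check(v2)) // false: below 1.0.0 the minor version is the stability boundary
```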
|
||||
|
||||
## Validation
|
||||
|
||||
In addition to testing a version against a constraint, a version can be validated
|
||||
against a constraint. When validation fails a slice of errors containing why a
|
||||
version didn't meet the constraint is returned. For example,
|
||||
|
||||
```go
|
||||
c, err := semver.NewConstraint("<= 1.2.3, >= 1.4")
|
||||
if err != nil {
|
||||
// Handle constraint not being parseable.
|
||||
}
|
||||
|
||||
v, err := semver.NewVersion("1.3")
|
||||
if err != nil {
|
||||
// Handle version not being parseable.
|
||||
}
|
||||
|
||||
// Validate a version against a constraint.
|
||||
a, msgs := c.Validate(v)
|
||||
// a is false
|
||||
for _, m := range msgs {
|
||||
fmt.Println(m)
|
||||
|
||||
// Loops over the errors which would read
|
||||
// "1.3 is greater than 1.2.3"
|
||||
// "1.3 is less than 1.4"
|
||||
}
|
||||
```
|
||||
|
||||
## Contribute
|
||||
|
||||
If you find an issue or want to contribute please file an [issue](https://github.com/Masterminds/semver/issues)
|
||||
or [create a pull request](https://github.com/Masterminds/semver/pulls).
|
||||
24
vendor/github.com/Masterminds/semver/v3/collection.go
generated
vendored
Normal file
|
|
@ -0,0 +1,24 @@
|
|||
package semver
|
||||
|
||||
// Collection is a collection of Version instances and implements the sort
|
||||
// interface. See the sort package for more details.
|
||||
// https://golang.org/pkg/sort/
|
||||
type Collection []*Version
|
||||
|
||||
// Len returns the length of a collection. The number of Version instances
|
||||
// on the slice.
|
||||
func (c Collection) Len() int {
|
||||
return len(c)
|
||||
}
|
||||
|
||||
// Less is needed for the sort interface to compare two Version objects on the
|
||||
// slice. If checks if one is less than the other.
|
||||
func (c Collection) Less(i, j int) bool {
|
||||
return c[i].LessThan(c[j])
|
||||
}
|
||||
|
||||
// Swap is needed for the sort interface to replace the Version objects
|
||||
// at two different positions in the slice.
|
||||
func (c Collection) Swap(i, j int) {
|
||||
c[i], c[j] = c[j], c[i]
|
||||
}
|
||||
594
vendor/github.com/Masterminds/semver/v3/constraints.go
generated
vendored
Normal file
|
|
@ -0,0 +1,594 @@
|
|||
package semver
|
||||
|
||||
import (
|
||||
"bytes"
|
||||
"errors"
|
||||
"fmt"
|
||||
"regexp"
|
||||
"strings"
|
||||
)
|
||||
|
||||
// Constraints is one or more constraint that a semantic version can be
|
||||
// checked against.
|
||||
type Constraints struct {
|
||||
constraints [][]*constraint
|
||||
}
|
||||
|
||||
// NewConstraint returns a Constraints instance that a Version instance can
|
||||
// be checked against. If there is a parse error it will be returned.
|
||||
func NewConstraint(c string) (*Constraints, error) {
|
||||
|
||||
// Rewrite - ranges into a comparison operation.
|
||||
c = rewriteRange(c)
|
||||
|
||||
ors := strings.Split(c, "||")
|
||||
or := make([][]*constraint, len(ors))
|
||||
for k, v := range ors {
|
||||
|
||||
// TODO: Find a way to validate and fetch all the constraints in a simpler form
|
||||
|
||||
// Validate the segment
|
||||
if !validConstraintRegex.MatchString(v) {
|
||||
return nil, fmt.Errorf("improper constraint: %s", v)
|
||||
}
|
||||
|
||||
cs := findConstraintRegex.FindAllString(v, -1)
|
||||
if cs == nil {
|
||||
cs = append(cs, v)
|
||||
}
|
||||
result := make([]*constraint, len(cs))
|
||||
for i, s := range cs {
|
||||
pc, err := parseConstraint(s)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
result[i] = pc
|
||||
}
|
||||
or[k] = result
|
||||
}
|
||||
|
||||
o := &Constraints{constraints: or}
|
||||
return o, nil
|
||||
}
|
||||
|
||||
// Check tests if a version satisfies the constraints.
|
||||
func (cs Constraints) Check(v *Version) bool {
|
||||
// TODO(mattfarina): For v4 of this library consolidate the Check and Validate
|
||||
// functions as the underlying functions make that possible now.
|
||||
// loop over the ORs and check the inner ANDs
|
||||
for _, o := range cs.constraints {
|
||||
joy := true
|
||||
for _, c := range o {
|
||||
if check, _ := c.check(v); !check {
|
||||
joy = false
|
||||
break
|
||||
}
|
||||
}
|
||||
|
||||
if joy {
|
||||
return true
|
||||
}
|
||||
}
|
||||
|
||||
return false
|
||||
}
|
||||
|
||||
// Validate checks if a version satisfies a constraint. If not a slice of
|
||||
// reasons for the failure are returned in addition to a bool.
|
||||
func (cs Constraints) Validate(v *Version) (bool, []error) {
|
||||
// loop over the ORs and check the inner ANDs
|
||||
var e []error
|
||||
|
||||
// Capture the prerelease message only once. When it happens the first time
|
||||
// this var is marked
|
||||
var prerelease bool
|
||||
for _, o := range cs.constraints {
|
||||
joy := true
|
||||
for _, c := range o {
|
||||
// Before running the check handle the case there the version is
|
||||
// a prerelease and the check is not searching for prereleases.
|
||||
if c.con.pre == "" && v.pre != "" {
|
||||
if !prerelease {
|
||||
em := fmt.Errorf("%s is a prerelease version and the constraint is only looking for release versions", v)
|
||||
e = append(e, em)
|
||||
prerelease = true
|
||||
}
|
||||
joy = false
|
||||
|
||||
} else {
|
||||
|
||||
if _, err := c.check(v); err != nil {
|
||||
e = append(e, err)
|
||||
joy = false
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
if joy {
|
||||
return true, []error{}
|
||||
}
|
||||
}
|
||||
|
||||
return false, e
|
||||
}
|
||||
|
||||
func (cs Constraints) String() string {
|
||||
buf := make([]string, len(cs.constraints))
|
||||
var tmp bytes.Buffer
|
||||
|
||||
for k, v := range cs.constraints {
|
||||
tmp.Reset()
|
||||
vlen := len(v)
|
||||
for kk, c := range v {
|
||||
tmp.WriteString(c.string())
|
||||
|
||||
// Space separate the AND conditions
|
||||
if vlen > 1 && kk < vlen-1 {
|
||||
tmp.WriteString(" ")
|
||||
}
|
||||
}
|
||||
buf[k] = tmp.String()
|
||||
}
|
||||
|
||||
return strings.Join(buf, " || ")
|
||||
}
|
||||
|
||||
// UnmarshalText implements the encoding.TextUnmarshaler interface.
|
||||
func (cs *Constraints) UnmarshalText(text []byte) error {
|
||||
temp, err := NewConstraint(string(text))
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
|
||||
*cs = *temp
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
// MarshalText implements the encoding.TextMarshaler interface.
|
||||
func (cs Constraints) MarshalText() ([]byte, error) {
|
||||
return []byte(cs.String()), nil
|
||||
}
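These two methods let `Constraints` round-trip through text-based encoders such as encoding/json. A hedged usage sketch follows; the `dependency` struct and its field names are hypothetical and exist only for illustration:

```go
package main

import (
	"encoding/json"
	"fmt"

	"github.com/Masterminds/semver/v3"
)

// dependency is a hypothetical struct used only for this sketch.
type dependency struct {
	Name string              `json:"name"`
	Rule *semver.Constraints `json:"rule"`
}

func main() {
	c, err := semver.NewConstraint("^1.2.3")
	if err != nil {
		panic(err)
	}

	// encoding/json picks up MarshalText, so the constraint is written as a
	// plain JSON string.
	out, _ := json.Marshal(dependency{Name: "example", Rule: c})
	fmt.Println(string(out)) // {"name":"example","rule":"^1.2.3"}

	// UnmarshalText restores it on the way back in.
	var d dependency
	_ = json.Unmarshal(out, &d)
	fmt.Println(d.Rule.String()) // ^1.2.3
}
```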
|
||||
|
||||
var constraintOps map[string]cfunc
|
||||
var constraintRegex *regexp.Regexp
|
||||
var constraintRangeRegex *regexp.Regexp
|
||||
|
||||
// Used to find individual constraints within a multi-constraint string
|
||||
var findConstraintRegex *regexp.Regexp
|
||||
|
||||
// Used to validate that a segment of ANDs is valid
|
||||
var validConstraintRegex *regexp.Regexp
|
||||
|
||||
const cvRegex string = `v?([0-9|x|X|\*]+)(\.[0-9|x|X|\*]+)?(\.[0-9|x|X|\*]+)?` +
|
||||
`(-([0-9A-Za-z\-]+(\.[0-9A-Za-z\-]+)*))?` +
|
||||
`(\+([0-9A-Za-z\-]+(\.[0-9A-Za-z\-]+)*))?`
|
||||
|
||||
func init() {
|
||||
constraintOps = map[string]cfunc{
|
||||
"": constraintTildeOrEqual,
|
||||
"=": constraintTildeOrEqual,
|
||||
"!=": constraintNotEqual,
|
||||
">": constraintGreaterThan,
|
||||
"<": constraintLessThan,
|
||||
">=": constraintGreaterThanEqual,
|
||||
"=>": constraintGreaterThanEqual,
|
||||
"<=": constraintLessThanEqual,
|
||||
"=<": constraintLessThanEqual,
|
||||
"~": constraintTilde,
|
||||
"~>": constraintTilde,
|
||||
"^": constraintCaret,
|
||||
}
|
||||
|
||||
ops := `=||!=|>|<|>=|=>|<=|=<|~|~>|\^`
|
||||
|
||||
constraintRegex = regexp.MustCompile(fmt.Sprintf(
|
||||
`^\s*(%s)\s*(%s)\s*$`,
|
||||
ops,
|
||||
cvRegex))
|
||||
|
||||
constraintRangeRegex = regexp.MustCompile(fmt.Sprintf(
|
||||
`\s*(%s)\s+-\s+(%s)\s*`,
|
||||
cvRegex, cvRegex))
|
||||
|
||||
findConstraintRegex = regexp.MustCompile(fmt.Sprintf(
|
||||
`(%s)\s*(%s)`,
|
||||
ops,
|
||||
cvRegex))
|
||||
|
||||
// The first time a constraint shows up will look slightly different from
|
||||
// future times it shows up due to a leading space or comma in a given
|
||||
// string.
|
||||
validConstraintRegex = regexp.MustCompile(fmt.Sprintf(
|
||||
`^(\s*(%s)\s*(%s)\s*)((?:\s+|,\s*)(%s)\s*(%s)\s*)*$`,
|
||||
ops,
|
||||
cvRegex,
|
||||
ops,
|
||||
cvRegex))
|
||||
}
|
||||
|
||||
// An individual constraint
|
||||
type constraint struct {
|
||||
// The version used in the constraint check. For example, if a constraint
|
||||
// is '<= 2.0.0' the con field is a version instance representing 2.0.0.
|
||||
con *Version
|
||||
|
||||
// The original parsed version (e.g., 4.x from != 4.x)
|
||||
orig string
|
||||
|
||||
// The original operator for the constraint
|
||||
origfunc string
|
||||
|
||||
// When an x is used as part of the version (e.g., 1.x)
|
||||
minorDirty bool
|
||||
dirty bool
|
||||
patchDirty bool
|
||||
}
|
||||
|
||||
// Check if a version meets the constraint
|
||||
func (c *constraint) check(v *Version) (bool, error) {
|
||||
return constraintOps[c.origfunc](v, c)
|
||||
}
|
||||
|
||||
// String prints an individual constraint into a string
|
||||
func (c *constraint) string() string {
|
||||
return c.origfunc + c.orig
|
||||
}
|
||||
|
||||
type cfunc func(v *Version, c *constraint) (bool, error)
|
||||
|
||||
func parseConstraint(c string) (*constraint, error) {
|
||||
if len(c) > 0 {
|
||||
m := constraintRegex.FindStringSubmatch(c)
|
||||
if m == nil {
|
||||
return nil, fmt.Errorf("improper constraint: %s", c)
|
||||
}
|
||||
|
||||
cs := &constraint{
|
||||
orig: m[2],
|
||||
origfunc: m[1],
|
||||
}
|
||||
|
||||
ver := m[2]
|
||||
minorDirty := false
|
||||
patchDirty := false
|
||||
dirty := false
|
||||
if isX(m[3]) || m[3] == "" {
|
||||
ver = fmt.Sprintf("0.0.0%s", m[6])
|
||||
dirty = true
|
||||
} else if isX(strings.TrimPrefix(m[4], ".")) || m[4] == "" {
|
||||
minorDirty = true
|
||||
dirty = true
|
||||
ver = fmt.Sprintf("%s.0.0%s", m[3], m[6])
|
||||
} else if isX(strings.TrimPrefix(m[5], ".")) || m[5] == "" {
|
||||
dirty = true
|
||||
patchDirty = true
|
||||
ver = fmt.Sprintf("%s%s.0%s", m[3], m[4], m[6])
|
||||
}
|
||||
|
||||
con, err := NewVersion(ver)
|
||||
if err != nil {
|
||||
|
||||
// The constraintRegex should catch any regex parsing errors. So,
|
||||
// we should never get here.
|
||||
return nil, errors.New("constraint Parser Error")
|
||||
}
|
||||
|
||||
cs.con = con
|
||||
cs.minorDirty = minorDirty
|
||||
cs.patchDirty = patchDirty
|
||||
cs.dirty = dirty
|
||||
|
||||
return cs, nil
|
||||
}
|
||||
|
||||
// The rest is the special case where an empty string was passed in which
|
||||
// is equivalent to * or >=0.0.0
|
||||
con, err := StrictNewVersion("0.0.0")
|
||||
if err != nil {
|
||||
|
||||
// The constraintRegex should catch any regex parsing errors. So,
|
||||
// we should never get here.
|
||||
return nil, errors.New("constraint Parser Error")
|
||||
}
|
||||
|
||||
cs := &constraint{
|
||||
con: con,
|
||||
orig: c,
|
||||
origfunc: "",
|
||||
minorDirty: false,
|
||||
patchDirty: false,
|
||||
dirty: true,
|
||||
}
|
||||
return cs, nil
|
||||
}
|
||||
|
||||
// Constraint functions
|
||||
func constraintNotEqual(v *Version, c *constraint) (bool, error) {
|
||||
if c.dirty {
|
||||
|
||||
// If there is a pre-release on the version but the constraint isn't looking
|
||||
// for them assume that pre-releases are not compatible. See issue 21 for
|
||||
// more details.
|
||||
if v.Prerelease() != "" && c.con.Prerelease() == "" {
|
||||
return false, fmt.Errorf("%s is a prerelease version and the constraint is only looking for release versions", v)
|
||||
}
|
||||
|
||||
if c.con.Major() != v.Major() {
|
||||
return true, nil
|
||||
}
|
||||
if c.con.Minor() != v.Minor() && !c.minorDirty {
|
||||
return true, nil
|
||||
} else if c.minorDirty {
|
||||
return false, fmt.Errorf("%s is equal to %s", v, c.orig)
|
||||
} else if c.con.Patch() != v.Patch() && !c.patchDirty {
|
||||
return true, nil
|
||||
} else if c.patchDirty {
|
||||
// Need to handle prereleases if present
|
||||
if v.Prerelease() != "" || c.con.Prerelease() != "" {
|
||||
eq := comparePrerelease(v.Prerelease(), c.con.Prerelease()) != 0
|
||||
if eq {
|
||||
return true, nil
|
||||
}
|
||||
return false, fmt.Errorf("%s is equal to %s", v, c.orig)
|
||||
}
|
||||
return false, fmt.Errorf("%s is equal to %s", v, c.orig)
|
||||
}
|
||||
}
|
||||
|
||||
eq := v.Equal(c.con)
|
||||
if eq {
|
||||
return false, fmt.Errorf("%s is equal to %s", v, c.orig)
|
||||
}
|
||||
|
||||
return true, nil
|
||||
}
|
||||
|
||||
func constraintGreaterThan(v *Version, c *constraint) (bool, error) {
|
||||
|
||||
// If there is a pre-release on the version but the constraint isn't looking
|
||||
// for them assume that pre-releases are not compatible. See issue 21 for
|
||||
// more details.
|
||||
if v.Prerelease() != "" && c.con.Prerelease() == "" {
|
||||
return false, fmt.Errorf("%s is a prerelease version and the constraint is only looking for release versions", v)
|
||||
}
|
||||
|
||||
var eq bool
|
||||
|
||||
if !c.dirty {
|
||||
eq = v.Compare(c.con) == 1
|
||||
if eq {
|
||||
return true, nil
|
||||
}
|
||||
return false, fmt.Errorf("%s is less than or equal to %s", v, c.orig)
|
||||
}
|
||||
|
||||
if v.Major() > c.con.Major() {
|
||||
return true, nil
|
||||
} else if v.Major() < c.con.Major() {
|
||||
return false, fmt.Errorf("%s is less than or equal to %s", v, c.orig)
|
||||
} else if c.minorDirty {
|
||||
// This is a range case such as >11. When the version is something like
|
||||
// 11.1.0 it is not > 11. For that we would need 12 or higher
|
||||
return false, fmt.Errorf("%s is less than or equal to %s", v, c.orig)
|
||||
} else if c.patchDirty {
|
||||
// This is for ranges such as >11.1. A version of 11.1.1 is not greater
|
||||
// while one of 11.2.1 is greater.
|
||||
eq = v.Minor() > c.con.Minor()
|
||||
if eq {
|
||||
return true, nil
|
||||
}
|
||||
return false, fmt.Errorf("%s is less than or equal to %s", v, c.orig)
|
||||
}
|
||||
|
||||
// If we have gotten here we are not comparing pre-preleases and can use the
|
||||
// Compare function to accomplish that.
|
||||
eq = v.Compare(c.con) == 1
|
||||
if eq {
|
||||
return true, nil
|
||||
}
|
||||
return false, fmt.Errorf("%s is less than or equal to %s", v, c.orig)
|
||||
}
|
||||
|
||||
func constraintLessThan(v *Version, c *constraint) (bool, error) {
|
||||
// If there is a pre-release on the version but the constraint isn't looking
|
||||
// for them assume that pre-releases are not compatible. See issue 21 for
|
||||
// more details.
|
||||
if v.Prerelease() != "" && c.con.Prerelease() == "" {
|
||||
return false, fmt.Errorf("%s is a prerelease version and the constraint is only looking for release versions", v)
|
||||
}
|
||||
|
||||
eq := v.Compare(c.con) < 0
|
||||
if eq {
|
||||
return true, nil
|
||||
}
|
||||
return false, fmt.Errorf("%s is greater than or equal to %s", v, c.orig)
|
||||
}
|
||||
|
||||
func constraintGreaterThanEqual(v *Version, c *constraint) (bool, error) {
|
||||
|
||||
// If there is a pre-release on the version but the constraint isn't looking
|
||||
// for them assume that pre-releases are not compatible. See issue 21 for
|
||||
// more details.
|
||||
if v.Prerelease() != "" && c.con.Prerelease() == "" {
|
||||
return false, fmt.Errorf("%s is a prerelease version and the constraint is only looking for release versions", v)
|
||||
}
|
||||
|
||||
eq := v.Compare(c.con) >= 0
|
||||
if eq {
|
||||
return true, nil
|
||||
}
|
||||
return false, fmt.Errorf("%s is less than %s", v, c.orig)
|
||||
}
|
||||
|
||||
func constraintLessThanEqual(v *Version, c *constraint) (bool, error) {
|
||||
// If there is a pre-release on the version but the constraint isn't looking
|
||||
// for them assume that pre-releases are not compatible. See issue 21 for
|
||||
// more details.
|
||||
if v.Prerelease() != "" && c.con.Prerelease() == "" {
|
||||
return false, fmt.Errorf("%s is a prerelease version and the constraint is only looking for release versions", v)
|
||||
}
|
||||
|
||||
var eq bool
|
||||
|
||||
if !c.dirty {
|
||||
eq = v.Compare(c.con) <= 0
|
||||
if eq {
|
||||
return true, nil
|
||||
}
|
||||
return false, fmt.Errorf("%s is greater than %s", v, c.orig)
|
||||
}
|
||||
|
||||
if v.Major() > c.con.Major() {
|
||||
return false, fmt.Errorf("%s is greater than %s", v, c.orig)
|
||||
} else if v.Major() == c.con.Major() && v.Minor() > c.con.Minor() && !c.minorDirty {
|
||||
return false, fmt.Errorf("%s is greater than %s", v, c.orig)
|
||||
}
|
||||
|
||||
return true, nil
|
||||
}
|
||||
|
||||
// ~*, ~>* --> >= 0.0.0 (any)
|
||||
// ~2, ~2.x, ~2.x.x, ~>2, ~>2.x ~>2.x.x --> >=2.0.0, <3.0.0
|
||||
// ~2.0, ~2.0.x, ~>2.0, ~>2.0.x --> >=2.0.0, <2.1.0
|
||||
// ~1.2, ~1.2.x, ~>1.2, ~>1.2.x --> >=1.2.0, <1.3.0
|
||||
// ~1.2.3, ~>1.2.3 --> >=1.2.3, <1.3.0
|
||||
// ~1.2.0, ~>1.2.0 --> >=1.2.0, <1.3.0
|
||||
func constraintTilde(v *Version, c *constraint) (bool, error) {
|
||||
// If there is a pre-release on the version but the constraint isn't looking
|
||||
// for them assume that pre-releases are not compatible. See issue 21 for
|
||||
// more details.
|
||||
if v.Prerelease() != "" && c.con.Prerelease() == "" {
|
||||
return false, fmt.Errorf("%s is a prerelease version and the constraint is only looking for release versions", v)
|
||||
}
|
||||
|
||||
if v.LessThan(c.con) {
|
||||
return false, fmt.Errorf("%s is less than %s", v, c.orig)
|
||||
}
|
||||
|
||||
// ~0.0.0 is a special case where all constraints are accepted. It's
|
||||
// equivalent to >= 0.0.0.
|
||||
if c.con.Major() == 0 && c.con.Minor() == 0 && c.con.Patch() == 0 &&
|
||||
!c.minorDirty && !c.patchDirty {
|
||||
return true, nil
|
||||
}
|
||||
|
||||
if v.Major() != c.con.Major() {
|
||||
return false, fmt.Errorf("%s does not have same major version as %s", v, c.orig)
|
||||
}
|
||||
|
||||
if v.Minor() != c.con.Minor() && !c.minorDirty {
|
||||
return false, fmt.Errorf("%s does not have same major and minor version as %s", v, c.orig)
|
||||
}
|
||||
|
||||
return true, nil
|
||||
}
|
||||
|
||||
// When there is a .x (dirty) status it automatically opts in to ~. Otherwise
|
||||
// it's a straight =
|
||||
func constraintTildeOrEqual(v *Version, c *constraint) (bool, error) {
|
||||
// If there is a pre-release on the version but the constraint isn't looking
|
||||
// for them assume that pre-releases are not compatible. See issue 21 for
|
||||
// more details.
|
||||
if v.Prerelease() != "" && c.con.Prerelease() == "" {
|
||||
return false, fmt.Errorf("%s is a prerelease version and the constraint is only looking for release versions", v)
|
||||
}
|
||||
|
||||
if c.dirty {
|
||||
return constraintTilde(v, c)
|
||||
}
|
||||
|
||||
eq := v.Equal(c.con)
|
||||
if eq {
|
||||
return true, nil
|
||||
}
|
||||
|
||||
return false, fmt.Errorf("%s is not equal to %s", v, c.orig)
|
||||
}
|
||||
|
||||
// ^* --> (any)
|
||||
// ^1.2.3 --> >=1.2.3 <2.0.0
|
||||
// ^1.2 --> >=1.2.0 <2.0.0
|
||||
// ^1 --> >=1.0.0 <2.0.0
|
||||
// ^0.2.3 --> >=0.2.3 <0.3.0
|
||||
// ^0.2 --> >=0.2.0 <0.3.0
|
||||
// ^0.0.3 --> >=0.0.3 <0.0.4
|
||||
// ^0.0 --> >=0.0.0 <0.1.0
|
||||
// ^0 --> >=0.0.0 <1.0.0
|
||||
func constraintCaret(v *Version, c *constraint) (bool, error) {
|
||||
// If there is a pre-release on the version but the constraint isn't looking
|
||||
// for them assume that pre-releases are not compatible. See issue 21 for
|
||||
// more details.
|
||||
if v.Prerelease() != "" && c.con.Prerelease() == "" {
|
||||
return false, fmt.Errorf("%s is a prerelease version and the constraint is only looking for release versions", v)
|
||||
}
|
||||
|
||||
// This less than handles prereleases
|
||||
if v.LessThan(c.con) {
|
||||
return false, fmt.Errorf("%s is less than %s", v, c.orig)
|
||||
}
|
||||
|
||||
var eq bool
|
||||
|
||||
// ^ when the major > 0 is >=x.y.z < x+1
|
||||
if c.con.Major() > 0 || c.minorDirty {
|
||||
|
||||
// ^ has to be within a major range for > 0. Everything less than was
|
||||
// filtered out with the LessThan call above. This filters out those
|
||||
// that greater but not within the same major range.
|
||||
eq = v.Major() == c.con.Major()
|
||||
if eq {
|
||||
return true, nil
|
||||
}
|
||||
return false, fmt.Errorf("%s does not have same major version as %s", v, c.orig)
|
||||
}
|
||||
|
||||
// ^ when the major is 0 and minor > 0 is >=0.y.z < 0.y+1
|
||||
if c.con.Major() == 0 && v.Major() > 0 {
|
||||
return false, fmt.Errorf("%s does not have same major version as %s", v, c.orig)
|
||||
}
|
||||
// If the con Minor is > 0 it is not dirty
|
||||
if c.con.Minor() > 0 || c.patchDirty {
|
||||
eq = v.Minor() == c.con.Minor()
|
||||
if eq {
|
||||
return true, nil
|
||||
}
|
||||
return false, fmt.Errorf("%s does not have same minor version as %s. Expected minor versions to match when constraint major version is 0", v, c.orig)
|
||||
}
|
||||
// ^ when the major is 0 and the constraint minor is 0, the version minor must also be 0 (=0.0.z)
|
||||
if c.con.Minor() == 0 && v.Minor() > 0 {
|
||||
return false, fmt.Errorf("%s does not have same minor version as %s", v, c.orig)
|
||||
}
|
||||
|
||||
// At this point the major is 0 and the minor is 0 and not dirty. The patch
|
||||
// is not dirty so we need to check if they are equal. If they are not equal the constraint fails.
|
||||
eq = c.con.Patch() == v.Patch()
|
||||
if eq {
|
||||
return true, nil
|
||||
}
|
||||
return false, fmt.Errorf("%s does not equal %s. Expect version and constraint to equal when major and minor versions are 0", v, c.orig)
|
||||
}
|
||||
|
||||
func isX(x string) bool {
|
||||
switch x {
|
||||
case "x", "*", "X":
|
||||
return true
|
||||
default:
|
||||
return false
|
||||
}
|
||||
}
|
||||
|
||||
func rewriteRange(i string) string {
|
||||
m := constraintRangeRegex.FindAllStringSubmatch(i, -1)
|
||||
if m == nil {
|
||||
return i
|
||||
}
|
||||
o := i
|
||||
for _, v := range m {
|
||||
t := fmt.Sprintf(">= %s, <= %s", v[1], v[11])
|
||||
o = strings.Replace(o, v[0], t, 1)
|
||||
}
|
||||
|
||||
return o
|
||||
}
|
||||
184
vendor/github.com/Masterminds/semver/v3/doc.go
generated
vendored
Normal file
|
|
@ -0,0 +1,184 @@
|
|||
/*
|
||||
Package semver provides the ability to work with Semantic Versions (http://semver.org) in Go.
|
||||
|
||||
Specifically it provides the ability to:
|
||||
|
||||
- Parse semantic versions
|
||||
- Sort semantic versions
|
||||
- Check if a semantic version fits within a set of constraints
|
||||
- Optionally work with a `v` prefix
|
||||
|
||||
# Parsing Semantic Versions
|
||||
|
||||
There are two functions that can parse semantic versions. The `StrictNewVersion`
|
||||
function only parses valid version 2 semantic versions as outlined in the
|
||||
specification. The `NewVersion` function attempts to coerce a version into a
|
||||
semantic version and parse it. For example, if there is a leading v or a version
|
||||
listed without all 3 parts (e.g. 1.2) it will attempt to coerce it into a valid
|
||||
semantic version (e.g., 1.2.0). In both cases a `Version` object is returned
|
||||
that can be sorted, compared, and used in constraints.
|
||||
|
||||
When parsing a version an optional error can be returned if there is an issue
|
||||
parsing the version. For example,
|
||||
|
||||
v, err := semver.NewVersion("1.2.3-beta.1+b345")
|
||||
|
||||
The version object has methods to get the parts of the version, compare it to
|
||||
other versions, convert the version back into a string, and get the original
|
||||
string. For more details please see the documentation
|
||||
at https://godoc.org/github.com/Masterminds/semver.
|
||||
|
||||
# Sorting Semantic Versions
|
||||
|
||||
A set of versions can be sorted using the `sort` package from the standard library.
|
||||
For example,
|
||||
|
||||
raw := []string{"1.2.3", "1.0", "1.3", "2", "0.4.2",}
|
||||
vs := make([]*semver.Version, len(raw))
|
||||
for i, r := range raw {
|
||||
v, err := semver.NewVersion(r)
|
||||
if err != nil {
|
||||
t.Errorf("Error parsing version: %s", err)
|
||||
}
|
||||
|
||||
vs[i] = v
|
||||
}
|
||||
|
||||
sort.Sort(semver.Collection(vs))
|
||||
|
||||
# Checking Version Constraints and Comparing Versions
|
||||
|
||||
There are two methods for comparing versions. One uses comparison methods on
|
||||
`Version` instances and the other uses Constraints. There are some important
|
||||
differences to note between these two methods of comparison.
|
||||
|
||||
1. When two versions are compared using functions such as `Compare`, `LessThan`,
|
||||
and others it will follow the specification and always include prereleases
|
||||
within the comparison. It will provide an answer valid with the comparison
|
||||
spec section at https://semver.org/#spec-item-11
|
||||
2. When constraint checking is used for checks or validation it will follow a
|
||||
different set of rules that are common for ranges with tools like npm/js
|
||||
and Rust/Cargo. This includes considering prereleases to be invalid if the
|
||||
range does not include one. If you want to have it include pre-releases a
|
||||
simple solution is to include `-0` in your range.
|
||||
3. Constraint ranges can have some complex rules including the shorthand use of
|
||||
~ and ^. For more details on those see the options below.
|
||||
|
||||
There are differences between the two methods of checking versions because the
|
||||
comparison methods on `Version` follow the specification while comparison ranges
|
||||
are not part of the specification. Different packages and tools have taken it
|
||||
upon themselves to come up with range rules. This has resulted in differences.
|
||||
For example, npm/js and Cargo/Rust follow similar patterns while PHP has a
|
||||
different pattern for ^. The comparison features in this package follow the
|
||||
npm/js and Cargo/Rust lead because applications using it have followed similar
|
||||
patterns with their versions.
|
||||
|
||||
Checking a version against version constraints is one of the most featureful
|
||||
parts of the package.
|
||||
|
||||
c, err := semver.NewConstraint(">= 1.2.3")
|
||||
if err != nil {
|
||||
// Handle constraint not being parsable.
|
||||
}
|
||||
|
||||
v, err := semver.NewVersion("1.3")
|
||||
if err != nil {
|
||||
// Handle version not being parsable.
|
||||
}
|
||||
// Check if the version meets the constraints. The a variable will be true.
|
||||
a := c.Check(v)
|
||||
|
||||
# Basic Comparisons
|
||||
|
||||
There are two elements to the comparisons. First, a comparison string is a list
|
||||
of comma or space separated AND comparisons. These are then separated by || (OR)
|
||||
comparisons. For example, `">= 1.2 < 3.0.0 || >= 4.2.3"` is looking for a
|
||||
comparison that's greater than or equal to 1.2 and less than 3.0.0 or is
|
||||
greater than or equal to 4.2.3. This can also be written as
|
||||
`">= 1.2, < 3.0.0 || >= 4.2.3"`
|
||||
|
||||
The basic comparisons are:
|
||||
|
||||
- `=`: equal (aliased to no operator)
|
||||
- `!=`: not equal
|
||||
- `>`: greater than
|
||||
- `<`: less than
|
||||
- `>=`: greater than or equal to
|
||||
- `<=`: less than or equal to
|
||||
|
||||
# Hyphen Range Comparisons
|
||||
|
||||
There are multiple methods to handle ranges and the first is hyphen ranges.
|
||||
These look like:
|
||||
|
||||
- `1.2 - 1.4.5` which is equivalent to `>= 1.2, <= 1.4.5`
|
||||
- `2.3.4 - 4.5` which is equivalent to `>= 2.3.4 <= 4.5`
|
||||
|
||||
# Wildcards In Comparisons
|
||||
|
||||
The `x`, `X`, and `*` characters can be used as a wildcard character. This works
|
||||
for all comparison operators. When used on the `=` operator it falls
|
||||
back to the tilde operation. For example,
|
||||
|
||||
- `1.2.x` is equivalent to `>= 1.2.0 < 1.3.0`
|
||||
- `>= 1.2.x` is equivalent to `>= 1.2.0`
|
||||
- `<= 2.x` is equivalent to `<= 3`
|
||||
- `*` is equivalent to `>= 0.0.0`
|
||||
|
||||
# Tilde Range Comparisons (Patch)
|
||||
|
||||
The tilde (`~`) comparison operator is for patch level ranges when a minor
|
||||
version is specified and major level changes when the minor number is missing.
|
||||
For example,
|
||||
|
||||
- `~1.2.3` is equivalent to `>= 1.2.3 < 1.3.0`
|
||||
- `~1` is equivalent to `>= 1, < 2`
|
||||
- `~2.3` is equivalent to `>= 2.3 < 2.4`
|
||||
- `~1.2.x` is equivalent to `>= 1.2.0 < 1.3.0`
|
||||
- `~1.x` is equivalent to `>= 1 < 2`
|
||||
|
||||
# Caret Range Comparisons (Major)
|
||||
|
||||
The caret (`^`) comparison operator is for major level changes once a stable
|
||||
(1.0.0) release has occurred. Prior to a 1.0.0 release the minor version acts
|
||||
as the API stability level. This is useful when comparing API versions, as a
|
||||
major change is API breaking. For example,
|
||||
|
||||
- `^1.2.3` is equivalent to `>= 1.2.3, < 2.0.0`
|
||||
- `^1.2.x` is equivalent to `>= 1.2.0, < 2.0.0`
|
||||
- `^2.3` is equivalent to `>= 2.3, < 3`
|
||||
- `^2.x` is equivalent to `>= 2.0.0, < 3`
|
||||
- `^0.2.3` is equivalent to `>=0.2.3 <0.3.0`
|
||||
- `^0.2` is equivalent to `>=0.2.0 <0.3.0`
|
||||
- `^0.0.3` is equivalent to `>=0.0.3 <0.0.4`
|
||||
- `^0.0` is equivalent to `>=0.0.0 <0.1.0`
|
||||
- `^0` is equivalent to `>=0.0.0 <1.0.0`
|
||||
|
||||
# Validation
|
||||
|
||||
In addition to testing a version against a constraint, a version can be validated
|
||||
against a constraint. When validation fails a slice of errors containing why a
|
||||
version didn't meet the constraint is returned. For example,
|
||||
|
||||
c, err := semver.NewConstraint("<= 1.2.3, >= 1.4")
|
||||
if err != nil {
|
||||
// Handle constraint not being parseable.
|
||||
}
|
||||
|
||||
v, _ := semver.NewVersion("1.3")
|
||||
if err != nil {
|
||||
// Handle version not being parseable.
|
||||
}
|
||||
|
||||
// Validate a version against a constraint.
|
||||
a, msgs := c.Validate(v)
|
||||
// a is false
|
||||
for _, m := range msgs {
|
||||
fmt.Println(m)
|
||||
|
||||
// Loops over the errors which would read
|
||||
// "1.3 is greater than 1.2.3"
|
||||
// "1.3 is less than 1.4"
|
||||
}
|
||||
*/
|
||||
package semver
|
||||
22
vendor/github.com/Masterminds/semver/v3/fuzz.go
generated
vendored
Normal file
|
|
@ -0,0 +1,22 @@
|
|||
// +build gofuzz
|
||||
|
||||
package semver
|
||||
|
||||
func Fuzz(data []byte) int {
|
||||
d := string(data)
|
||||
|
||||
// Test NewVersion
|
||||
_, _ = NewVersion(d)
|
||||
|
||||
// Test StrictNewVersion
|
||||
_, _ = StrictNewVersion(d)
|
||||
|
||||
// Test NewConstraint
|
||||
_, _ = NewConstraint(d)
|
||||
|
||||
// The return value should be 0 normally, 1 if the priority in future tests
|
||||
// should be increased, and -1 if future tests should skip passing in that
|
||||
// data. We do not have a reason to change priority so 0 is always returned.
|
||||
// There are example tests that do this.
|
||||
return 0
|
||||
}
|
||||
639
vendor/github.com/Masterminds/semver/v3/version.go
generated
vendored
Normal file
|
|
@ -0,0 +1,639 @@
|
|||
package semver
|
||||
|
||||
import (
|
||||
"bytes"
|
||||
"database/sql/driver"
|
||||
"encoding/json"
|
||||
"errors"
|
||||
"fmt"
|
||||
"regexp"
|
||||
"strconv"
|
||||
"strings"
|
||||
)
|
||||
|
||||
// The compiled version of the regex created at init() is cached here so it
|
||||
// only needs to be created once.
|
||||
var versionRegex *regexp.Regexp
|
||||
|
||||
var (
|
||||
// ErrInvalidSemVer is returned when a version is found to be invalid while
|
||||
// being parsed.
|
||||
ErrInvalidSemVer = errors.New("Invalid Semantic Version")
|
||||
|
||||
// ErrEmptyString is returned when an empty string is passed in for parsing.
|
||||
ErrEmptyString = errors.New("Version string empty")
|
||||
|
||||
// ErrInvalidCharacters is returned when invalid characters are found as
|
||||
// part of a version
|
||||
ErrInvalidCharacters = errors.New("Invalid characters in version")
|
||||
|
||||
// ErrSegmentStartsZero is returned when a version segment starts with 0.
|
||||
// This is invalid in SemVer.
|
||||
ErrSegmentStartsZero = errors.New("Version segment starts with 0")
|
||||
|
||||
// ErrInvalidMetadata is returned when the metadata is an invalid format
|
||||
ErrInvalidMetadata = errors.New("Invalid Metadata string")
|
||||
|
||||
// ErrInvalidPrerelease is returned when the pre-release is an invalid format
|
||||
ErrInvalidPrerelease = errors.New("Invalid Prerelease string")
|
||||
)
|
||||
|
||||
// semVerRegex is the regular expression used to parse a semantic version.
|
||||
const semVerRegex string = `v?([0-9]+)(\.[0-9]+)?(\.[0-9]+)?` +
|
||||
`(-([0-9A-Za-z\-]+(\.[0-9A-Za-z\-]+)*))?` +
|
||||
`(\+([0-9A-Za-z\-]+(\.[0-9A-Za-z\-]+)*))?`
|
||||
|
||||
// Version represents a single semantic version.
|
||||
type Version struct {
|
||||
major, minor, patch uint64
|
||||
pre string
|
||||
metadata string
|
||||
original string
|
||||
}
|
||||
|
||||
func init() {
|
||||
versionRegex = regexp.MustCompile("^" + semVerRegex + "$")
|
||||
}
|
||||
|
||||
const (
|
||||
num string = "0123456789"
|
||||
allowed string = "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ-" + num
|
||||
)
|
||||
|
||||
// StrictNewVersion parses a given version and returns an instance of Version or
|
||||
// an error if unable to parse the version. Only parses valid semantic versions.
|
||||
// Performs checking that can find errors within the version.
|
||||
// If you want to coerce a version such as 1 or 1.2 and parse it as the 1.x
|
||||
// releases of semver did, use the NewVersion() function.
|
||||
func StrictNewVersion(v string) (*Version, error) {
|
||||
// Parsing here does not use RegEx in order to increase performance and reduce
|
||||
// allocations.
|
||||
|
||||
if len(v) == 0 {
|
||||
return nil, ErrEmptyString
|
||||
}
|
||||
|
||||
// Split the parts into [0]major, [1]minor, and [2]patch,prerelease,build
|
||||
parts := strings.SplitN(v, ".", 3)
|
||||
if len(parts) != 3 {
|
||||
return nil, ErrInvalidSemVer
|
||||
}
|
||||
|
||||
sv := &Version{
|
||||
original: v,
|
||||
}
|
||||
|
||||
// check for prerelease or build metadata
|
||||
var extra []string
|
||||
if strings.ContainsAny(parts[2], "-+") {
|
||||
// Start with the build metadata first as it needs to be on the right
|
||||
extra = strings.SplitN(parts[2], "+", 2)
|
||||
if len(extra) > 1 {
|
||||
// build metadata found
|
||||
sv.metadata = extra[1]
|
||||
parts[2] = extra[0]
|
||||
}
|
||||
|
||||
extra = strings.SplitN(parts[2], "-", 2)
|
||||
if len(extra) > 1 {
|
||||
// prerelease found
|
||||
sv.pre = extra[1]
|
||||
parts[2] = extra[0]
|
||||
}
|
||||
}
|
||||
|
||||
// Validate the number segments are valid. This includes only having positive
|
||||
// numbers and no leading 0's.
|
||||
for _, p := range parts {
|
||||
if !containsOnly(p, num) {
|
||||
return nil, ErrInvalidCharacters
|
||||
}
|
||||
|
||||
if len(p) > 1 && p[0] == '0' {
|
||||
return nil, ErrSegmentStartsZero
|
||||
}
|
||||
}
|
||||
|
||||
// Extract the major, minor, and patch elements onto the returned Version
|
||||
var err error
|
||||
sv.major, err = strconv.ParseUint(parts[0], 10, 64)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
sv.minor, err = strconv.ParseUint(parts[1], 10, 64)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
sv.patch, err = strconv.ParseUint(parts[2], 10, 64)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
// No prerelease or build metadata found so returning now as a fastpath.
|
||||
if sv.pre == "" && sv.metadata == "" {
|
||||
return sv, nil
|
||||
}
|
||||
|
||||
if sv.pre != "" {
|
||||
if err = validatePrerelease(sv.pre); err != nil {
|
||||
return nil, err
|
||||
}
|
||||
}
|
||||
|
||||
if sv.metadata != "" {
|
||||
if err = validateMetadata(sv.metadata); err != nil {
|
||||
return nil, err
|
||||
}
|
||||
}
|
||||
|
||||
return sv, nil
|
||||
}
|
||||
|
||||
// NewVersion parses a given version and returns an instance of Version or
|
||||
// an error if unable to parse the version. If the version is SemVer-ish it
|
||||
// attempts to convert it to SemVer. If you want to validate it was a strict
|
||||
// semantic version at parse time see StrictNewVersion().
|
||||
func NewVersion(v string) (*Version, error) {
|
||||
m := versionRegex.FindStringSubmatch(v)
|
||||
if m == nil {
|
||||
return nil, ErrInvalidSemVer
|
||||
}
|
||||
|
||||
sv := &Version{
|
||||
metadata: m[8],
|
||||
pre: m[5],
|
||||
original: v,
|
||||
}
|
||||
|
||||
var err error
|
||||
sv.major, err = strconv.ParseUint(m[1], 10, 64)
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("Error parsing version segment: %s", err)
|
||||
}
|
||||
|
||||
if m[2] != "" {
|
||||
sv.minor, err = strconv.ParseUint(strings.TrimPrefix(m[2], "."), 10, 64)
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("Error parsing version segment: %s", err)
|
||||
}
|
||||
} else {
|
||||
sv.minor = 0
|
||||
}
|
||||
|
||||
if m[3] != "" {
|
||||
sv.patch, err = strconv.ParseUint(strings.TrimPrefix(m[3], "."), 10, 64)
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("Error parsing version segment: %s", err)
|
||||
}
|
||||
} else {
|
||||
sv.patch = 0
|
||||
}
|
||||
|
||||
// Perform some basic due diligence on the extra parts to ensure they are
|
||||
// valid.
|
||||
|
||||
if sv.pre != "" {
|
||||
if err = validatePrerelease(sv.pre); err != nil {
|
||||
return nil, err
|
||||
}
|
||||
}
|
||||
|
||||
if sv.metadata != "" {
|
||||
if err = validateMetadata(sv.metadata); err != nil {
|
||||
return nil, err
|
||||
}
|
||||
}
|
||||
|
||||
return sv, nil
|
||||
}
|
||||
|
||||
// New creates a new instance of Version with each of the parts passed in as
|
||||
// arguments instead of parsing a version string.
|
||||
func New(major, minor, patch uint64, pre, metadata string) *Version {
|
||||
v := Version{
|
||||
major: major,
|
||||
minor: minor,
|
||||
patch: patch,
|
||||
pre: pre,
|
||||
metadata: metadata,
|
||||
original: "",
|
||||
}
|
||||
|
||||
v.original = v.String()
|
||||
|
||||
return &v
|
||||
}
|
||||
|
||||
// MustParse parses a given version and panics on error.
|
||||
func MustParse(v string) *Version {
|
||||
sv, err := NewVersion(v)
|
||||
if err != nil {
|
||||
panic(err)
|
||||
}
|
||||
return sv
|
||||
}
|
||||
|
||||
// String converts a Version object to a string.
|
||||
// Note, if the original version contained a leading v this version will not.
|
||||
// See the Original() method to retrieve the original value. Semantic Versions
|
||||
// don't contain a leading v per the spec. Instead it's optional on
|
||||
// implementation.
|
||||
func (v Version) String() string {
|
||||
var buf bytes.Buffer
|
||||
|
||||
fmt.Fprintf(&buf, "%d.%d.%d", v.major, v.minor, v.patch)
|
||||
if v.pre != "" {
|
||||
fmt.Fprintf(&buf, "-%s", v.pre)
|
||||
}
|
||||
if v.metadata != "" {
|
||||
fmt.Fprintf(&buf, "+%s", v.metadata)
|
||||
}
|
||||
|
||||
return buf.String()
|
||||
}
|
||||
|
||||
// Original returns the original value passed in to be parsed.
|
||||
func (v *Version) Original() string {
|
||||
return v.original
|
||||
}
|
||||
|
||||
// Major returns the major version.
|
||||
func (v Version) Major() uint64 {
|
||||
return v.major
|
||||
}
|
||||
|
||||
// Minor returns the minor version.
|
||||
func (v Version) Minor() uint64 {
|
||||
return v.minor
|
||||
}
|
||||
|
||||
// Patch returns the patch version.
|
||||
func (v Version) Patch() uint64 {
|
||||
return v.patch
|
||||
}
|
||||
|
||||
// Prerelease returns the pre-release version.
|
||||
func (v Version) Prerelease() string {
|
||||
return v.pre
|
||||
}
|
||||
|
||||
// Metadata returns the metadata on the version.
|
||||
func (v Version) Metadata() string {
|
||||
return v.metadata
|
||||
}
|
||||
|
||||
// originalVPrefix returns the original 'v' prefix if any.
|
||||
func (v Version) originalVPrefix() string {
|
||||
// Note, only lowercase v is supported as a prefix by the parser.
|
||||
if v.original != "" && v.original[:1] == "v" {
|
||||
return v.original[:1]
|
||||
}
|
||||
return ""
|
||||
}
|
||||
|
||||
// IncPatch produces the next patch version.
|
||||
// If the current version does not have prerelease/metadata information,
|
||||
// it unsets metadata and prerelease values, increments patch number.
|
||||
// If the current version has any of prerelease or metadata information,
|
||||
// it unsets both values and keeps current patch value
|
||||
func (v Version) IncPatch() Version {
|
||||
vNext := v
|
||||
// according to http://semver.org/#spec-item-9
|
||||
// Pre-release versions have a lower precedence than the associated normal version.
|
||||
// according to http://semver.org/#spec-item-10
|
||||
// Build metadata SHOULD be ignored when determining version precedence.
|
||||
if v.pre != "" {
|
||||
vNext.metadata = ""
|
||||
vNext.pre = ""
|
||||
} else {
|
||||
vNext.metadata = ""
|
||||
vNext.pre = ""
|
||||
vNext.patch = v.patch + 1
|
||||
}
|
||||
vNext.original = v.originalVPrefix() + "" + vNext.String()
|
||||
return vNext
|
||||
}
|
||||
|
||||
// IncMinor produces the next minor version.
|
||||
// Sets patch to 0.
|
||||
// Increments minor number.
|
||||
// Unsets metadata.
|
||||
// Unsets prerelease status.
|
||||
func (v Version) IncMinor() Version {
|
||||
vNext := v
|
||||
vNext.metadata = ""
|
||||
vNext.pre = ""
|
||||
vNext.patch = 0
|
||||
vNext.minor = v.minor + 1
|
||||
vNext.original = v.originalVPrefix() + "" + vNext.String()
|
||||
return vNext
|
||||
}
|
||||
|
||||
// IncMajor produces the next major version.
|
||||
// Sets patch to 0.
|
||||
// Sets minor to 0.
|
||||
// Increments major number.
|
||||
// Unsets metadata.
|
||||
// Unsets prerelease status.
|
||||
func (v Version) IncMajor() Version {
|
||||
vNext := v
|
||||
vNext.metadata = ""
|
||||
vNext.pre = ""
|
||||
vNext.patch = 0
|
||||
vNext.minor = 0
|
||||
vNext.major = v.major + 1
|
||||
vNext.original = v.originalVPrefix() + "" + vNext.String()
|
||||
return vNext
|
||||
}
|
||||
|
||||
// SetPrerelease defines the prerelease value.
|
||||
// Value must not include the required 'hyphen' prefix.
|
||||
func (v Version) SetPrerelease(prerelease string) (Version, error) {
|
||||
vNext := v
|
||||
if len(prerelease) > 0 {
|
||||
if err := validatePrerelease(prerelease); err != nil {
|
||||
return vNext, err
|
||||
}
|
||||
}
|
||||
vNext.pre = prerelease
|
||||
vNext.original = v.originalVPrefix() + "" + vNext.String()
|
||||
return vNext, nil
|
||||
}
|
||||
|
||||
// SetMetadata defines metadata value.
|
||||
// Value must not include the required 'plus' prefix.
|
||||
func (v Version) SetMetadata(metadata string) (Version, error) {
|
||||
vNext := v
|
||||
if len(metadata) > 0 {
|
||||
if err := validateMetadata(metadata); err != nil {
|
||||
return vNext, err
|
||||
}
|
||||
}
|
||||
vNext.metadata = metadata
|
||||
vNext.original = v.originalVPrefix() + "" + vNext.String()
|
||||
return vNext, nil
|
||||
}
|
||||
|
||||
// LessThan tests if one version is less than another one.
|
||||
func (v *Version) LessThan(o *Version) bool {
|
||||
return v.Compare(o) < 0
|
||||
}
|
||||
|
||||
// GreaterThan tests if one version is greater than another one.
|
||||
func (v *Version) GreaterThan(o *Version) bool {
|
||||
return v.Compare(o) > 0
|
||||
}
|
||||
|
||||
// Equal tests if two versions are equal to each other.
|
||||
// Note, versions can be equal with different metadata since metadata
|
||||
// is not considered part of the comparable version.
|
||||
func (v *Version) Equal(o *Version) bool {
|
||||
return v.Compare(o) == 0
|
||||
}
|
||||
|
||||
// Compare compares this version to another one. It returns -1, 0, or 1 if
|
||||
// the version is smaller, equal, or larger than the other version.
|
||||
//
|
||||
// Versions are compared by X.Y.Z. Build metadata is ignored. Prerelease is
|
||||
// lower than the version without a prerelease. Compare always takes into account
|
||||
// prereleases. If you want to work with ranges using typical range syntaxes that
|
||||
// skip prereleases if the range is not looking for them use constraints.
|
||||
func (v *Version) Compare(o *Version) int {
|
||||
// Compare the major, minor, and patch version for differences. If a
|
||||
// difference is found return the comparison.
|
||||
if d := compareSegment(v.Major(), o.Major()); d != 0 {
|
||||
return d
|
||||
}
|
||||
if d := compareSegment(v.Minor(), o.Minor()); d != 0 {
|
||||
return d
|
||||
}
|
||||
if d := compareSegment(v.Patch(), o.Patch()); d != 0 {
|
||||
return d
|
||||
}
|
||||
|
||||
// At this point the major, minor, and patch versions are the same.
|
||||
ps := v.pre
|
||||
po := o.Prerelease()
|
||||
|
||||
if ps == "" && po == "" {
|
||||
return 0
|
||||
}
|
||||
if ps == "" {
|
||||
return 1
|
||||
}
|
||||
if po == "" {
|
||||
return -1
|
||||
}
|
||||
|
||||
return comparePrerelease(ps, po)
|
||||
}
|
||||
|
||||
// UnmarshalJSON implements JSON.Unmarshaler interface.
|
||||
func (v *Version) UnmarshalJSON(b []byte) error {
|
||||
var s string
|
||||
if err := json.Unmarshal(b, &s); err != nil {
|
||||
return err
|
||||
}
|
||||
temp, err := NewVersion(s)
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
v.major = temp.major
|
||||
v.minor = temp.minor
|
||||
v.patch = temp.patch
|
||||
v.pre = temp.pre
|
||||
v.metadata = temp.metadata
|
||||
v.original = temp.original
|
||||
return nil
|
||||
}
|
||||
|
||||
// MarshalJSON implements JSON.Marshaler interface.
|
||||
func (v Version) MarshalJSON() ([]byte, error) {
|
||||
return json.Marshal(v.String())
|
||||
}
|
||||
|
||||
// UnmarshalText implements the encoding.TextUnmarshaler interface.
|
||||
func (v *Version) UnmarshalText(text []byte) error {
|
||||
temp, err := NewVersion(string(text))
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
|
||||
*v = *temp
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
// MarshalText implements the encoding.TextMarshaler interface.
|
||||
func (v Version) MarshalText() ([]byte, error) {
|
||||
return []byte(v.String()), nil
|
||||
}
|
||||
|
||||
// Scan implements the SQL.Scanner interface.
|
||||
func (v *Version) Scan(value interface{}) error {
|
||||
var s string
|
||||
s, _ = value.(string)
|
||||
temp, err := NewVersion(s)
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
v.major = temp.major
|
||||
v.minor = temp.minor
|
||||
v.patch = temp.patch
|
||||
v.pre = temp.pre
|
||||
v.metadata = temp.metadata
|
||||
v.original = temp.original
|
||||
return nil
|
||||
}
|
||||
|
||||
// Value implements the Driver.Valuer interface.
|
||||
func (v Version) Value() (driver.Value, error) {
|
||||
return v.String(), nil
|
||||
}
|
||||
|
||||
func compareSegment(v, o uint64) int {
|
||||
if v < o {
|
||||
return -1
|
||||
}
|
||||
if v > o {
|
||||
return 1
|
||||
}
|
||||
|
||||
return 0
|
||||
}
|
||||
|
||||
func comparePrerelease(v, o string) int {
|
||||
// split the prerelease versions by their parts. The separator, per the spec,
|
||||
// is a .
|
||||
sparts := strings.Split(v, ".")
|
||||
oparts := strings.Split(o, ".")
|
||||
|
||||
// Find the longer length of the parts to know how many loop iterations to
|
||||
// go through.
|
||||
slen := len(sparts)
|
||||
olen := len(oparts)
|
||||
|
||||
l := slen
|
||||
if olen > slen {
|
||||
l = olen
|
||||
}
|
||||
|
||||
// Iterate over each part of the prereleases to compare the differences.
|
||||
for i := 0; i < l; i++ {
|
||||
// Since the length of the parts can be different we need to create
|
||||
// a placeholder. This is to avoid out of bounds issues.
|
||||
stemp := ""
|
||||
if i < slen {
|
||||
stemp = sparts[i]
|
||||
}
|
||||
|
||||
otemp := ""
|
||||
if i < olen {
|
||||
otemp = oparts[i]
|
||||
}
|
||||
|
||||
d := comparePrePart(stemp, otemp)
|
||||
if d != 0 {
|
||||
return d
|
||||
}
|
||||
}
|
||||
|
||||
// Reaching here means two versions are of equal value but have different
|
||||
// metadata (the part following a +). They are not identical in string form
|
||||
// but the version comparison finds them to be equal.
|
||||
return 0
|
||||
}
|
||||
|
||||
func comparePrePart(s, o string) int {
|
||||
// Fastpath if they are equal
|
||||
if s == o {
|
||||
return 0
|
||||
}
|
||||
|
||||
// When s or o are empty we can use the other in an attempt to determine
|
||||
// the response.
|
||||
if s == "" {
|
||||
if o != "" {
|
||||
return -1
|
||||
}
|
||||
return 1
|
||||
}
|
||||
|
||||
if o == "" {
|
||||
if s != "" {
|
||||
return 1
|
||||
}
|
||||
return -1
|
||||
}
|
||||
|
||||
// When comparing strings "99" is greater than "103". To handle
|
||||
// cases like this we need to detect numbers and compare them. According
|
||||
// to the semver spec, numbers are always positive. If there is a - at the
|
||||
// start like -99 this is to be evaluated as an alphanum. numbers always
|
||||
// have precedence over alphanum. Parsing as Uints because negative numbers
|
||||
// are ignored.
|
||||
|
||||
oi, n1 := strconv.ParseUint(o, 10, 64)
|
||||
si, n2 := strconv.ParseUint(s, 10, 64)
|
||||
|
||||
// The case where both are strings compare the strings
|
||||
if n1 != nil && n2 != nil {
|
||||
if s > o {
|
||||
return 1
|
||||
}
|
||||
return -1
|
||||
} else if n1 != nil {
|
||||
// o is a string and s is a number
|
||||
return -1
|
||||
} else if n2 != nil {
|
||||
// s is a string and o is a number
|
||||
return 1
|
||||
}
|
||||
// Both are numbers
|
||||
if si > oi {
|
||||
return 1
|
||||
}
|
||||
return -1
|
||||
}
|
||||
|
||||
// Like strings.ContainsAny, but checks that s contains only characters from comp.
|
||||
func containsOnly(s string, comp string) bool {
|
||||
return strings.IndexFunc(s, func(r rune) bool {
|
||||
return !strings.ContainsRune(comp, r)
|
||||
}) == -1
|
||||
}
|
||||
|
||||
// From the spec, "Identifiers MUST comprise only
|
||||
// ASCII alphanumerics and hyphen [0-9A-Za-z-]. Identifiers MUST NOT be empty.
|
||||
// Numeric identifiers MUST NOT include leading zeroes.". These segments can
|
||||
// be dot separated.
|
||||
func validatePrerelease(p string) error {
|
||||
eparts := strings.Split(p, ".")
|
||||
for _, p := range eparts {
|
||||
if containsOnly(p, num) {
|
||||
if len(p) > 1 && p[0] == '0' {
|
||||
return ErrSegmentStartsZero
|
||||
}
|
||||
} else if !containsOnly(p, allowed) {
|
||||
return ErrInvalidPrerelease
|
||||
}
|
||||
}
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
// From the spec, "Build metadata MAY be denoted by
|
||||
// appending a plus sign and a series of dot separated identifiers immediately
|
||||
// following the patch or pre-release version. Identifiers MUST comprise only
|
||||
// ASCII alphanumerics and hyphen [0-9A-Za-z-]. Identifiers MUST NOT be empty."
|
||||
func validateMetadata(m string) error {
|
||||
eparts := strings.Split(m, ".")
|
||||
for _, p := range eparts {
|
||||
if !containsOnly(p, allowed) {
|
||||
return ErrInvalidMetadata
|
||||
}
|
||||
}
|
||||
return nil
|
||||
}
|
||||
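For context, a minimal sketch of how the parsing and comparison functions documented above are typically called from application code. The module path is taken from the vendor directory; the inline output comments are illustrative, not asserted by this diff.

```go
package main

import (
	"fmt"

	"github.com/Masterminds/semver/v3"
)

func main() {
	// NewVersion coerces SemVer-ish input such as "v1.2" into a full version.
	loose, err := semver.NewVersion("v1.2")
	if err != nil {
		panic(err)
	}

	// StrictNewVersion rejects the same input because it is not a complete
	// major.minor.patch semantic version.
	if _, err := semver.StrictNewVersion("v1.2"); err != nil {
		fmt.Println("strict parse failed:", err)
	}

	// Compare ignores build metadata and ranks prereleases below releases.
	release := semver.MustParse("1.2.0")
	pre := semver.MustParse("1.2.0-beta.1")
	fmt.Println(loose.Compare(release)) // 0
	fmt.Println(pre.LessThan(release))  // true
}
```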
2
vendor/github.com/Masterminds/sprig/v3/.gitignore
generated
vendored
Normal file
@@ -0,0 +1,2 @@
vendor/
/.glide
383
vendor/github.com/Masterminds/sprig/v3/CHANGELOG.md
generated
vendored
Normal file
@@ -0,0 +1,383 @@
|
|||
# Changelog
|
||||
|
||||
## Release 3.2.3 (2022-11-29)
|
||||
|
||||
### Changed
|
||||
|
||||
- Updated docs (thanks @book987 @aJetHorn @neelayu @pellizzetti @apricote @SaigyoujiYuyuko233 @AlekSi)
|
||||
- #348: Updated huandu/xstrings which fixed a snake case bug (thanks @yxxhero)
|
||||
- #353: Updated masterminds/semver which included bug fixes
|
||||
- #354: Updated golang.org/x/crypto which included bug fixes
|
||||
|
||||
## Release 3.2.2 (2021-02-04)
|
||||
|
||||
This is a re-release of 3.2.1 to satisfy something with the Go module system.
|
||||
|
||||
## Release 3.2.1 (2021-02-04)
|
||||
|
||||
### Changed
|
||||
|
||||
- Upgraded `Masterminds/goutils` to `v1.1.1`. see the [Security Advisory](https://github.com/Masterminds/goutils/security/advisories/GHSA-xg2h-wx96-xgxr)
|
||||
|
||||
## Release 3.2.0 (2020-12-14)
|
||||
|
||||
### Added
|
||||
|
||||
- #211: Added randInt function (thanks @kochurovro)
|
||||
- #223: Added fromJson and mustFromJson functions (thanks @mholt)
|
||||
- #242: Added a bcrypt function (thanks @robbiet480)
|
||||
- #253: Added randBytes function (thanks @MikaelSmith)
|
||||
- #254: Added dig function for dicts (thanks @nyarly)
|
||||
- #257: Added regexQuoteMeta for quoting regex metadata (thanks @rheaton)
|
||||
- #261: Added filepath functions osBase, osDir, osExt, osClean, osIsAbs (thanks @zugl)
|
||||
- #268: Added and and all functions for testing conditions (thanks @phuslu)
|
||||
- #181: Added float64 arithmetic addf, add1f, subf, divf, mulf, maxf, and minf
|
||||
(thanks @andrewmostello)
|
||||
- #265: Added chunk function to split array into smaller arrays (thanks @karelbilek)
|
||||
- #270: Extend certificate functions to handle non-RSA keys + add support for
|
||||
ed25519 keys (thanks @misberner)
|
||||
|
||||
### Changed
|
||||
|
||||
- Removed testing and support for Go 1.12. ed25519 support requires Go 1.13 or newer
|
||||
- Using semver 3.1.1 and mergo 0.3.11
|
||||
|
||||
### Fixed
|
||||
|
||||
- #249: Fix htmlDateInZone example (thanks @spawnia)
|
||||
|
||||
NOTE: The dependency github.com/imdario/mergo reverted the breaking change in
|
||||
0.3.9 via 0.3.10 release.
|
||||
|
||||
## Release 3.1.0 (2020-04-16)
|
||||
|
||||
NOTE: The dependency github.com/imdario/mergo made a behavior change in 0.3.9
|
||||
that impacts sprig functionality. Do not use sprig with a version newer than 0.3.8.
|
||||
|
||||
### Added
|
||||
|
||||
- #225: Added support for generating htpasswd hash (thanks @rustycl0ck)
|
||||
- #224: Added duration filter (thanks @frebib)
|
||||
- #205: Added `seq` function (thanks @thadc23)
|
||||
|
||||
### Changed
|
||||
|
||||
- #203: Unlambda functions with correct signature (thanks @muesli)
|
||||
- #236: Updated the license formatting for GitHub display purposes
|
||||
- #238: Updated package dependency versions. Note, mergo not updated to 0.3.9
|
||||
as it causes a breaking change for sprig. That issue is tracked at
|
||||
https://github.com/imdario/mergo/issues/139
|
||||
|
||||
### Fixed
|
||||
|
||||
- #229: Fix `seq` example in docs (thanks @kalmant)
|
||||
|
||||
## Release 3.0.2 (2019-12-13)
|
||||
|
||||
### Fixed
|
||||
|
||||
- #220: Updating to semver v3.0.3 to fix issue with <= ranges
|
||||
- #218: fix typo elyptical->elliptic in ecdsa key description (thanks @laverya)
|
||||
|
||||
## Release 3.0.1 (2019-12-08)
|
||||
|
||||
### Fixed
|
||||
|
||||
- #212: Updated semver fixing broken constraint checking with ^0.0
|
||||
|
||||
## Release 3.0.0 (2019-10-02)
|
||||
|
||||
### Added
|
||||
|
||||
- #187: Added durationRound function (thanks @yjp20)
|
||||
- #189: Added numerous template functions that return errors rather than panic (thanks @nrvnrvn)
|
||||
- #193: Added toRawJson support (thanks @Dean-Coakley)
|
||||
- #197: Added get support to dicts (thanks @Dean-Coakley)
|
||||
|
||||
### Changed
|
||||
|
||||
- #186: Moving dependency management to Go modules
|
||||
- #186: Updated semver to v3. This has changes in the way ^ is handled
|
||||
- #194: Updated documentation on merging and how it copies. Added example using deepCopy
|
||||
- #196: trunc now supports negative values (thanks @Dean-Coakley)
|
||||
|
||||
## Release 2.22.0 (2019-10-02)
|
||||
|
||||
### Added
|
||||
|
||||
- #173: Added getHostByName function to resolve dns names to ips (thanks @fcgravalos)
|
||||
- #195: Added deepCopy function for use with dicts
|
||||
|
||||
### Changed
|
||||
|
||||
- Updated merge and mergeOverwrite documentation to explain copying and how to
|
||||
use deepCopy with it
|
||||
|
||||
## Release 2.21.0 (2019-09-18)
|
||||
|
||||
### Added
|
||||
|
||||
- #122: Added encryptAES/decryptAES functions (thanks @n0madic)
|
||||
- #128: Added toDecimal support (thanks @Dean-Coakley)
|
||||
- #169: Added list concat (thanks @astorath)
|
||||
- #174: Added deepEqual function (thanks @bonifaido)
|
||||
- #170: Added url parse and join functions (thanks @astorath)
|
||||
|
||||
### Changed
|
||||
|
||||
- #171: Updated glide config for Google UUID to v1 and to add ranges to semver and testify
|
||||
|
||||
### Fixed
|
||||
|
||||
- #172: Fix semver wildcard example (thanks @piepmatz)
|
||||
- #175: Fix dateInZone doc example (thanks @s3than)
|
||||
|
||||
## Release 2.20.0 (2019-06-18)
|
||||
|
||||
### Added
|
||||
|
||||
- #164: Adding function to get unix epoch for a time (@mattfarina)
|
||||
- #166: Adding tests for date_in_zone (@mattfarina)
|
||||
|
||||
### Changed
|
||||
|
||||
- #144: Fix function comments based on best practices from Effective Go (@CodeLingoTeam)
|
||||
- #150: Handles pointer type for time.Time in "htmlDate" (@mapreal19)
|
||||
- #161, #157, #160, #153, #158, #156, #155, #159, #152 documentation updates (@badeadan)
|
||||
|
||||
### Fixed
|
||||
|
||||
## Release 2.19.0 (2019-03-02)
|
||||
|
||||
IMPORTANT: This release reverts a change from 2.18.0
|
||||
|
||||
In the previous release (2.18), we prematurely merged a partial change to the crypto functions that led to creating two sets of crypto functions (I blame @technosophos -- since that's me). This release rolls back that change, and does what was originally intended: It alters the existing crypto functions to use secure random.
|
||||
|
||||
We debated whether this classifies as a change worthy of major revision, but given the proximity to the last release, we have decided that treating 2.18 as a faulty release is the correct course of action. We apologize for any inconvenience.
|
||||
|
||||
### Changed
|
||||
|
||||
- Fix substr panic 35fb796 (Alexey igrychev)
|
||||
- Remove extra period 1eb7729 (Matthew Lorimor)
|
||||
- Make random string functions use crypto by default 6ceff26 (Matthew Lorimor)
|
||||
- README edits/fixes/suggestions 08fe136 (Lauri Apple)
|
||||
|
||||
|
||||
## Release 2.18.0 (2019-02-12)
|
||||
|
||||
### Added
|
||||
|
||||
- Added mergeOverwrite function
|
||||
- cryptographic functions that use secure random (see fe1de12)
|
||||
|
||||
### Changed
|
||||
|
||||
- Improve documentation of regexMatch function, resolves #139 90b89ce (Jan Tagscherer)
|
||||
- Handle has for nil list 9c10885 (Daniel Cohen)
|
||||
- Document behaviour of mergeOverwrite fe0dbe9 (Lukas Rieder)
|
||||
- doc: adds missing documentation. 4b871e6 (Fernandez Ludovic)
|
||||
- Replace outdated goutils imports 01893d2 (Matthew Lorimor)
|
||||
- Surface crypto secure random strings from goutils fe1de12 (Matthew Lorimor)
|
||||
- Handle untyped nil values as parameters to string functions 2b2ec8f (Morten Torkildsen)
|
||||
|
||||
### Fixed
|
||||
|
||||
- Fix dict merge issue and provide mergeOverwrite .dst .src1 to overwrite from src -> dst 4c59c12 (Lukas Rieder)
|
||||
- Fix substr var names and comments d581f80 (Dean Coakley)
|
||||
- Fix substr documentation 2737203 (Dean Coakley)
|
||||
|
||||
## Release 2.17.1 (2019-01-03)
|
||||
|
||||
### Fixed
|
||||
|
||||
The 2.17.0 release did not have a version pinned for xstrings, which caused compilation failures when xstrings < 1.2 was used. This adds the correct version string to glide.yaml.
|
||||
|
||||
## Release 2.17.0 (2019-01-03)
|
||||
|
||||
### Added
|
||||
|
||||
- adds alder32sum function and test 6908fc2 (marshallford)
|
||||
- Added kebabcase function ca331a1 (Ilyes512)
|
||||
|
||||
### Changed
|
||||
|
||||
- Update goutils to 1.1.0 4e1125d (Matt Butcher)
|
||||
|
||||
### Fixed
|
||||
|
||||
- Fix 'has' documentation e3f2a85 (dean-coakley)
|
||||
- docs(dict): fix typo in pick example dc424f9 (Dustin Specker)
|
||||
- fixes spelling errors... not sure how that happened 4cf188a (marshallford)
|
||||
|
||||
## Release 2.16.0 (2018-08-13)
|
||||
|
||||
### Added
|
||||
|
||||
- add splitn function fccb0b0 (Helgi Þorbjörnsson)
|
||||
- Add slice func df28ca7 (gongdo)
|
||||
- Generate serial number a3bdffd (Cody Coons)
|
||||
- Extract values of dict with values function df39312 (Lawrence Jones)
|
||||
|
||||
### Changed
|
||||
|
||||
- Modify panic message for list.slice ae38335 (gongdo)
|
||||
- Minor improvement in code quality - Removed an unreachable piece of code at defaults.go#L26:6 - Resolve formatting issues. 5834241 (Abhishek Kashyap)
|
||||
- Remove duplicated documentation 1d97af1 (Matthew Fisher)
|
||||
- Test on go 1.11 49df809 (Helgi Þormar Þorbjörnsson)
|
||||
|
||||
### Fixed
|
||||
|
||||
- Fix file permissions c5f40b5 (gongdo)
|
||||
- Fix example for buildCustomCert 7779e0d (Tin Lam)
|
||||
|
||||
## Release 2.15.0 (2018-04-02)
|
||||
|
||||
### Added
|
||||
|
||||
- #68 and #69: Add json helpers to docs (thanks @arunvelsriram)
|
||||
- #66: Add ternary function (thanks @binoculars)
|
||||
- #67: Allow keys function to take multiple dicts (thanks @binoculars)
|
||||
- #89: Added sha1sum to crypto function (thanks @benkeil)
|
||||
- #81: Allow customizing Root CA that used by genSignedCert (thanks @chenzhiwei)
|
||||
- #92: Add travis testing for go 1.10
|
||||
- #93: Adding appveyor config for windows testing
|
||||
|
||||
### Changed
|
||||
|
||||
- #90: Updating to more recent dependencies
|
||||
- #73: replace satori/go.uuid with google/uuid (thanks @petterw)
|
||||
|
||||
### Fixed
|
||||
|
||||
- #76: Fixed documentation typos (thanks @Thiht)
|
||||
- Fixed rounding issue on the `ago` function. Note, this removes support for Go 1.8 and older
|
||||
|
||||
## Release 2.14.1 (2017-12-01)
|
||||
|
||||
### Fixed
|
||||
|
||||
- #60: Fix typo in function name documentation (thanks @neil-ca-moore)
|
||||
- #61: Removing line with {{ due to blocking github pages generation
|
||||
- #64: Update the list functions to handle int, string, and other slices for compatibility
|
||||
|
||||
## Release 2.14.0 (2017-10-06)
|
||||
|
||||
This new version of Sprig adds a set of functions for generating and working with SSL certificates.
|
||||
|
||||
- `genCA` generates an SSL Certificate Authority
|
||||
- `genSelfSignedCert` generates an SSL self-signed certificate
|
||||
- `genSignedCert` generates an SSL certificate and key based on a given CA
|
||||
|
||||
## Release 2.13.0 (2017-09-18)
|
||||
|
||||
This release adds new functions, including:
|
||||
|
||||
- `regexMatch`, `regexFindAll`, `regexFind`, `regexReplaceAll`, `regexReplaceAllLiteral`, and `regexSplit` to work with regular expressions
|
||||
- `floor`, `ceil`, and `round` math functions
|
||||
- `toDate` converts a string to a date
|
||||
- `nindent` is just like `indent` but also prepends a new line
|
||||
- `ago` returns the time from `time.Now`
|
||||
|
||||
### Added
|
||||
|
||||
- #40: Added basic regex functionality (thanks @alanquillin)
|
||||
- #41: Added ceil floor and round functions (thanks @alanquillin)
|
||||
- #48: Added toDate function (thanks @andreynering)
|
||||
- #50: Added nindent function (thanks @binoculars)
|
||||
- #46: Added ago function (thanks @slayer)
|
||||
|
||||
### Changed
|
||||
|
||||
- #51: Updated godocs to include new string functions (thanks @curtisallen)
|
||||
- #49: Added ability to merge multiple dicts (thanks @binoculars)
|
||||
|
||||
## Release 2.12.0 (2017-05-17)
|
||||
|
||||
- `snakecase`, `camelcase`, and `shuffle` are three new string functions
|
||||
- `fail` allows you to bail out of a template render when conditions are not met
|
||||
|
||||
## Release 2.11.0 (2017-05-02)
|
||||
|
||||
- Added `toJson` and `toPrettyJson`
|
||||
- Added `merge`
|
||||
- Refactored documentation
|
||||
|
||||
## Release 2.10.0 (2017-03-15)
|
||||
|
||||
- Added `semver` and `semverCompare` for Semantic Versions
|
||||
- `list` replaces `tuple`
|
||||
- Fixed issue with `join`
|
||||
- Added `first`, `last`, `initial`, `rest`, `prepend`, `append`, `toString`, `toStrings`, `sortAlpha`, `reverse`, `coalesce`, `pluck`, `pick`, `compact`, `keys`, `omit`, `uniq`, `has`, `without`
|
||||
|
||||
## Release 2.9.0 (2017-02-23)
|
||||
|
||||
- Added `splitList` to split a list
|
||||
- Added crypto functions of `genPrivateKey` and `derivePassword`
|
||||
|
||||
## Release 2.8.0 (2016-12-21)
|
||||
|
||||
- Added access to several path functions (`base`, `dir`, `clean`, `ext`, and `abs`)
|
||||
- Added functions for _mutating_ dictionaries (`set`, `unset`, `hasKey`)
|
||||
|
||||
## Release 2.7.0 (2016-12-01)
|
||||
|
||||
- Added `sha256sum` to generate a hash of an input
|
||||
- Added functions to convert a numeric or string to `int`, `int64`, `float64`
|
||||
|
||||
## Release 2.6.0 (2016-10-03)
|
||||
|
||||
- Added a `uuidv4` template function for generating UUIDs inside of a template.
|
||||
|
||||
## Release 2.5.0 (2016-08-19)
|
||||
|
||||
- New `trimSuffix`, `trimPrefix`, `hasSuffix`, and `hasPrefix` functions
|
||||
- New aliases have been added for a few functions that didn't follow the naming conventions (`trimAll` and `abbrevBoth`)
|
||||
- `trimall` and `abbrevboth` (notice the case) are deprecated and will be removed in 3.0.0
|
||||
|
||||
## Release 2.4.0 (2016-08-16)
|
||||
|
||||
- Adds two functions: `until` and `untilStep`
|
||||
|
||||
## Release 2.3.0 (2016-06-21)
|
||||
|
||||
- cat: Concatenate strings with whitespace separators.
|
||||
- replace: Replace parts of a string: `replace " " "-" "Me First"` renders "Me-First"
|
||||
- plural: Format plurals: `len "foo" | plural "one foo" "many foos"` renders "many foos"
|
||||
- indent: Indent blocks of text in a way that is sensitive to "\n" characters.
|
||||
|
||||
## Release 2.2.0 (2016-04-21)
|
||||
|
||||
- Added a `genPrivateKey` function (Thanks @bacongobbler)
|
||||
|
||||
## Release 2.1.0 (2016-03-30)
|
||||
|
||||
- `default` now prints the default value when it does not receive a value down the pipeline. It is much safer now to do `{{.Foo | default "bar"}}`.
|
||||
- Added accessors for "hermetic" functions. These return only functions that, when given the same input, produce the same output.
|
||||
|
||||
## Release 2.0.0 (2016-03-29)
|
||||
|
||||
Because we switched from `int` to `int64` as the return value for all integer math functions, the library's major version number has been incremented.
|
||||
|
||||
- `min` complements `max` (formerly `biggest`)
|
||||
- `empty` indicates that a value is the empty value for its type
|
||||
- `tuple` creates a tuple inside of a template: `{{$t := tuple "a", "b" "c"}}`
|
||||
- `dict` creates a dictionary inside of a template `{{$d := dict "key1" "val1" "key2" "val2"}}`
|
||||
- Date formatters have been added for HTML dates (as used in `date` input fields)
|
||||
- Integer math functions can convert from a number of types, including `string` (via `strconv.ParseInt`).
|
||||
|
||||
## Release 1.2.0 (2016-02-01)
|
||||
|
||||
- Added quote and squote
|
||||
- Added b32enc and b32dec
|
||||
- add now takes varargs
|
||||
- biggest now takes varargs
|
||||
|
||||
## Release 1.1.0 (2015-12-29)
|
||||
|
||||
- Added #4: Added contains function. strings.Contains, but with the arguments
|
||||
switched to simplify common pipelines. (thanks krancour)
|
||||
- Added Travis-CI testing support
|
||||
|
||||
## Release 1.0.0 (2015-12-23)
|
||||
|
||||
- Initial release
|
||||
19
vendor/github.com/Masterminds/sprig/v3/LICENSE.txt
generated
vendored
Normal file
@@ -0,0 +1,19 @@
|
|||
Copyright (C) 2013-2020 Masterminds
|
||||
|
||||
Permission is hereby granted, free of charge, to any person obtaining a copy
|
||||
of this software and associated documentation files (the "Software"), to deal
|
||||
in the Software without restriction, including without limitation the rights
|
||||
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
|
||||
copies of the Software, and to permit persons to whom the Software is
|
||||
furnished to do so, subject to the following conditions:
|
||||
|
||||
The above copyright notice and this permission notice shall be included in
|
||||
all copies or substantial portions of the Software.
|
||||
|
||||
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
|
||||
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
|
||||
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
|
||||
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
|
||||
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
|
||||
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
|
||||
THE SOFTWARE.
|
||||
9
vendor/github.com/Masterminds/sprig/v3/Makefile
generated
vendored
Normal file
@@ -0,0 +1,9 @@
.PHONY: test
test:
	@echo "==> Running tests"
	GO111MODULE=on go test -v

.PHONY: test-cover
test-cover:
	@echo "==> Running Tests with coverage"
	GO111MODULE=on go test -cover .
100
vendor/github.com/Masterminds/sprig/v3/README.md
generated
vendored
Normal file
@@ -0,0 +1,100 @@
|
|||
# Sprig: Template functions for Go templates
|
||||
|
||||
[](https://pkg.go.dev/github.com/Masterminds/sprig/v3)
|
||||
[](https://goreportcard.com/report/github.com/Masterminds/sprig)
|
||||
[](https://masterminds.github.io/stability/sustained.html)
|
||||
[](https://github.com/Masterminds/sprig/actions)
|
||||
|
||||
The Go language comes with a [built-in template
|
||||
language](http://golang.org/pkg/text/template/), but not
|
||||
very many template functions. Sprig is a library that provides more than 100 commonly
|
||||
used template functions.
|
||||
|
||||
It is inspired by the template functions found in
|
||||
[Twig](http://twig.sensiolabs.org/documentation) and in various
|
||||
JavaScript libraries, such as [underscore.js](http://underscorejs.org/).
|
||||
|
||||
## IMPORTANT NOTES
|
||||
|
||||
Sprig leverages [mergo](https://github.com/imdario/mergo) to handle merges. In
|
||||
its v0.3.9 release, there was a behavior change that impacts merging template
|
||||
functions in sprig. It is currently recommended to use v0.3.10 or later of that package.
|
||||
Using v0.3.9 will cause sprig tests to fail.
|
||||
|
||||
## Package Versions
|
||||
|
||||
There are two active major versions of the `sprig` package.
|
||||
|
||||
* v3 is the currently stable release series on the `master` branch. The Go API should
|
||||
remain compatible with v2, the previous stable version. Behavior changes behind
|
||||
some functions are the reason for the new major version.
|
||||
* v2 is the previous stable release series. It has been more than three years since
|
||||
the initial release of v2. You can read the documentation and see the code
|
||||
on the [release-2](https://github.com/Masterminds/sprig/tree/release-2) branch.
|
||||
Bug fixes to this major version will continue for some time.
|
||||
|
||||
## Usage
|
||||
|
||||
**Template developers**: Please use Sprig's [function documentation](http://masterminds.github.io/sprig/) for
|
||||
detailed instructions and code snippets for the >100 template functions available.
|
||||
|
||||
**Go developers**: If you'd like to include Sprig as a library in your program,
|
||||
our API documentation is available [at GoDoc.org](http://godoc.org/github.com/Masterminds/sprig).
|
||||
|
||||
For standard usage, read on.
|
||||
|
||||
### Load the Sprig library
|
||||
|
||||
To load the Sprig `FuncMap`:
|
||||
|
||||
```go
|
||||
|
||||
import (
|
||||
"github.com/Masterminds/sprig/v3"
|
||||
"html/template"
|
||||
)
|
||||
|
||||
// This example illustrates that the FuncMap *must* be set before the
|
||||
// templates themselves are loaded.
|
||||
tpl := template.Must(
|
||||
template.New("base").Funcs(sprig.FuncMap()).ParseGlob("*.html")
|
||||
)
|
||||
|
||||
|
||||
```
|
||||
|
||||
### Calling the functions inside of templates
|
||||
|
||||
By convention, all functions are lowercase. This seems to follow the Go
|
||||
idiom for template functions (as opposed to template methods, which are
|
||||
TitleCase). For example, this:
|
||||
|
||||
```
|
||||
{{ "hello!" | upper | repeat 5 }}
|
||||
```
|
||||
|
||||
produces this:
|
||||
|
||||
```
|
||||
HELLO!HELLO!HELLO!HELLO!HELLO!
|
||||
```
|
||||
|
||||
## Principles Driving Our Function Selection
|
||||
|
||||
We followed these principles to decide which functions to add and how to implement them:
|
||||
|
||||
- Use template functions to build layout. The following
|
||||
types of operations are within the domain of template functions:
|
||||
- Formatting
|
||||
- Layout
|
||||
- Simple type conversions
|
||||
- Utilities that assist in handling common formatting and layout needs (e.g. arithmetic)
|
||||
- Template functions should not return errors unless there is no way to print
|
||||
a sensible value. For example, converting a string to an integer should not
|
||||
produce an error if conversion fails. Instead, it should display a default
|
||||
value.
|
||||
- Simple math is necessary for grid layouts, pagers, and so on. Complex math
|
||||
(anything other than arithmetic) should be done outside of templates.
|
||||
- Template functions only deal with the data passed into them. They never retrieve
|
||||
data from a source.
|
||||
- Finally, do not override core Go template functions.
|
||||
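As a quick complement to the README snippets above, a self-contained sketch that parses and executes one of the pipelines shown there. Everything used here (the `FuncMap()` call with `html/template`, and the `upper` and `repeat` functions) comes from the README itself.

```go
package main

import (
	"html/template"
	"os"

	"github.com/Masterminds/sprig/v3"
)

func main() {
	// The FuncMap must be registered before the template text is parsed,
	// as the README notes above.
	tpl := template.Must(
		template.New("demo").
			Funcs(sprig.FuncMap()).
			Parse(`{{ "hello!" | upper | repeat 5 }}`),
	)

	// Prints: HELLO!HELLO!HELLO!HELLO!HELLO!
	if err := tpl.Execute(os.Stdout, nil); err != nil {
		panic(err)
	}
}
```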
653
vendor/github.com/Masterminds/sprig/v3/crypto.go
generated
vendored
Normal file
@@ -0,0 +1,653 @@
|
|||
package sprig
|
||||
|
||||
import (
|
||||
"bytes"
|
||||
"crypto"
|
||||
"crypto/aes"
|
||||
"crypto/cipher"
|
||||
"crypto/dsa"
|
||||
"crypto/ecdsa"
|
||||
"crypto/ed25519"
|
||||
"crypto/elliptic"
|
||||
"crypto/hmac"
|
||||
"crypto/rand"
|
||||
"crypto/rsa"
|
||||
"crypto/sha1"
|
||||
"crypto/sha256"
|
||||
"crypto/x509"
|
||||
"crypto/x509/pkix"
|
||||
"encoding/asn1"
|
||||
"encoding/base64"
|
||||
"encoding/binary"
|
||||
"encoding/hex"
|
||||
"encoding/pem"
|
||||
"errors"
|
||||
"fmt"
|
||||
"hash/adler32"
|
||||
"io"
|
||||
"math/big"
|
||||
"net"
|
||||
"time"
|
||||
|
||||
"strings"
|
||||
|
||||
"github.com/google/uuid"
|
||||
bcrypt_lib "golang.org/x/crypto/bcrypt"
|
||||
"golang.org/x/crypto/scrypt"
|
||||
)
|
||||
|
||||
func sha256sum(input string) string {
|
||||
hash := sha256.Sum256([]byte(input))
|
||||
return hex.EncodeToString(hash[:])
|
||||
}
|
||||
|
||||
func sha1sum(input string) string {
|
||||
hash := sha1.Sum([]byte(input))
|
||||
return hex.EncodeToString(hash[:])
|
||||
}
|
||||
|
||||
func adler32sum(input string) string {
|
||||
hash := adler32.Checksum([]byte(input))
|
||||
return fmt.Sprintf("%d", hash)
|
||||
}
|
||||
|
||||
func bcrypt(input string) string {
|
||||
hash, err := bcrypt_lib.GenerateFromPassword([]byte(input), bcrypt_lib.DefaultCost)
|
||||
if err != nil {
|
||||
return fmt.Sprintf("failed to encrypt string with bcrypt: %s", err)
|
||||
}
|
||||
|
||||
return string(hash)
|
||||
}
|
||||
|
||||
func htpasswd(username string, password string) string {
|
||||
if strings.Contains(username, ":") {
|
||||
return fmt.Sprintf("invalid username: %s", username)
|
||||
}
|
||||
return fmt.Sprintf("%s:%s", username, bcrypt(password))
|
||||
}
|
||||
|
||||
func randBytes(count int) (string, error) {
|
||||
buf := make([]byte, count)
|
||||
if _, err := rand.Read(buf); err != nil {
|
||||
return "", err
|
||||
}
|
||||
return base64.StdEncoding.EncodeToString(buf), nil
|
||||
}
|
||||
|
||||
// uuidv4 provides a safe and secure UUID v4 implementation
|
||||
func uuidv4() string {
|
||||
return uuid.New().String()
|
||||
}
|
||||
|
||||
var masterPasswordSeed = "com.lyndir.masterpassword"
|
||||
|
||||
var passwordTypeTemplates = map[string][][]byte{
|
||||
"maximum": {[]byte("anoxxxxxxxxxxxxxxxxx"), []byte("axxxxxxxxxxxxxxxxxno")},
|
||||
"long": {[]byte("CvcvnoCvcvCvcv"), []byte("CvcvCvcvnoCvcv"), []byte("CvcvCvcvCvcvno"), []byte("CvccnoCvcvCvcv"), []byte("CvccCvcvnoCvcv"),
|
||||
[]byte("CvccCvcvCvcvno"), []byte("CvcvnoCvccCvcv"), []byte("CvcvCvccnoCvcv"), []byte("CvcvCvccCvcvno"), []byte("CvcvnoCvcvCvcc"),
|
||||
[]byte("CvcvCvcvnoCvcc"), []byte("CvcvCvcvCvccno"), []byte("CvccnoCvccCvcv"), []byte("CvccCvccnoCvcv"), []byte("CvccCvccCvcvno"),
|
||||
[]byte("CvcvnoCvccCvcc"), []byte("CvcvCvccnoCvcc"), []byte("CvcvCvccCvccno"), []byte("CvccnoCvcvCvcc"), []byte("CvccCvcvnoCvcc"),
|
||||
[]byte("CvccCvcvCvccno")},
|
||||
"medium": {[]byte("CvcnoCvc"), []byte("CvcCvcno")},
|
||||
"short": {[]byte("Cvcn")},
|
||||
"basic": {[]byte("aaanaaan"), []byte("aannaaan"), []byte("aaannaaa")},
|
||||
"pin": {[]byte("nnnn")},
|
||||
}
|
||||
|
||||
var templateCharacters = map[byte]string{
|
||||
'V': "AEIOU",
|
||||
'C': "BCDFGHJKLMNPQRSTVWXYZ",
|
||||
'v': "aeiou",
|
||||
'c': "bcdfghjklmnpqrstvwxyz",
|
||||
'A': "AEIOUBCDFGHJKLMNPQRSTVWXYZ",
|
||||
'a': "AEIOUaeiouBCDFGHJKLMNPQRSTVWXYZbcdfghjklmnpqrstvwxyz",
|
||||
'n': "0123456789",
|
||||
'o': "@&%?,=[]_:-+*$#!'^~;()/.",
|
||||
'x': "AEIOUaeiouBCDFGHJKLMNPQRSTVWXYZbcdfghjklmnpqrstvwxyz0123456789!@#$%^&*()",
|
||||
}
|
||||
|
||||
func derivePassword(counter uint32, passwordType, password, user, site string) string {
|
||||
var templates = passwordTypeTemplates[passwordType]
|
||||
if templates == nil {
|
||||
return fmt.Sprintf("cannot find password template %s", passwordType)
|
||||
}
|
||||
|
||||
var buffer bytes.Buffer
|
||||
buffer.WriteString(masterPasswordSeed)
|
||||
binary.Write(&buffer, binary.BigEndian, uint32(len(user)))
|
||||
buffer.WriteString(user)
|
||||
|
||||
salt := buffer.Bytes()
|
||||
key, err := scrypt.Key([]byte(password), salt, 32768, 8, 2, 64)
|
||||
if err != nil {
|
||||
return fmt.Sprintf("failed to derive password: %s", err)
|
||||
}
|
||||
|
||||
buffer.Truncate(len(masterPasswordSeed))
|
||||
binary.Write(&buffer, binary.BigEndian, uint32(len(site)))
|
||||
buffer.WriteString(site)
|
||||
binary.Write(&buffer, binary.BigEndian, counter)
|
||||
|
||||
var hmacv = hmac.New(sha256.New, key)
|
||||
hmacv.Write(buffer.Bytes())
|
||||
var seed = hmacv.Sum(nil)
|
||||
var temp = templates[int(seed[0])%len(templates)]
|
||||
|
||||
buffer.Truncate(0)
|
||||
for i, element := range temp {
|
||||
passChars := templateCharacters[element]
|
||||
passChar := passChars[int(seed[i+1])%len(passChars)]
|
||||
buffer.WriteByte(passChar)
|
||||
}
|
||||
|
||||
return buffer.String()
|
||||
}
|
||||
|
||||
func generatePrivateKey(typ string) string {
|
||||
var priv interface{}
|
||||
var err error
|
||||
switch typ {
|
||||
case "", "rsa":
|
||||
// good enough for government work
|
||||
priv, err = rsa.GenerateKey(rand.Reader, 4096)
|
||||
case "dsa":
|
||||
key := new(dsa.PrivateKey)
|
||||
// again, good enough for government work
|
||||
if err = dsa.GenerateParameters(&key.Parameters, rand.Reader, dsa.L2048N256); err != nil {
|
||||
return fmt.Sprintf("failed to generate dsa params: %s", err)
|
||||
}
|
||||
err = dsa.GenerateKey(key, rand.Reader)
|
||||
priv = key
|
||||
case "ecdsa":
|
||||
// again, good enough for government work
|
||||
priv, err = ecdsa.GenerateKey(elliptic.P256(), rand.Reader)
|
||||
case "ed25519":
|
||||
_, priv, err = ed25519.GenerateKey(rand.Reader)
|
||||
default:
|
||||
return "Unknown type " + typ
|
||||
}
|
||||
if err != nil {
|
||||
return fmt.Sprintf("failed to generate private key: %s", err)
|
||||
}
|
||||
|
||||
return string(pem.EncodeToMemory(pemBlockForKey(priv)))
|
||||
}
|
||||
|
||||
// DSAKeyFormat stores the format for DSA keys.
|
||||
// Used by pemBlockForKey
|
||||
type DSAKeyFormat struct {
|
||||
Version int
|
||||
P, Q, G, Y, X *big.Int
|
||||
}
|
||||
|
||||
func pemBlockForKey(priv interface{}) *pem.Block {
|
||||
switch k := priv.(type) {
|
||||
case *rsa.PrivateKey:
|
||||
return &pem.Block{Type: "RSA PRIVATE KEY", Bytes: x509.MarshalPKCS1PrivateKey(k)}
|
||||
case *dsa.PrivateKey:
|
||||
val := DSAKeyFormat{
|
||||
P: k.P, Q: k.Q, G: k.G,
|
||||
Y: k.Y, X: k.X,
|
||||
}
|
||||
bytes, _ := asn1.Marshal(val)
|
||||
return &pem.Block{Type: "DSA PRIVATE KEY", Bytes: bytes}
|
||||
case *ecdsa.PrivateKey:
|
||||
b, _ := x509.MarshalECPrivateKey(k)
|
||||
return &pem.Block{Type: "EC PRIVATE KEY", Bytes: b}
|
||||
default:
|
||||
// attempt PKCS#8 format for all other keys
|
||||
b, err := x509.MarshalPKCS8PrivateKey(k)
|
||||
if err != nil {
|
||||
return nil
|
||||
}
|
||||
return &pem.Block{Type: "PRIVATE KEY", Bytes: b}
|
||||
}
|
||||
}
|
||||
|
||||
func parsePrivateKeyPEM(pemBlock string) (crypto.PrivateKey, error) {
|
||||
block, _ := pem.Decode([]byte(pemBlock))
|
||||
if block == nil {
|
||||
return nil, errors.New("no PEM data in input")
|
||||
}
|
||||
|
||||
if block.Type == "PRIVATE KEY" {
|
||||
priv, err := x509.ParsePKCS8PrivateKey(block.Bytes)
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("decoding PEM as PKCS#8: %s", err)
|
||||
}
|
||||
return priv, nil
|
||||
} else if !strings.HasSuffix(block.Type, " PRIVATE KEY") {
|
||||
return nil, fmt.Errorf("no private key data in PEM block of type %s", block.Type)
|
||||
}
|
||||
|
||||
switch block.Type[:len(block.Type)-12] { // strip " PRIVATE KEY"
|
||||
case "RSA":
|
||||
priv, err := x509.ParsePKCS1PrivateKey(block.Bytes)
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("parsing RSA private key from PEM: %s", err)
|
||||
}
|
||||
return priv, nil
|
||||
case "EC":
|
||||
priv, err := x509.ParseECPrivateKey(block.Bytes)
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("parsing EC private key from PEM: %s", err)
|
||||
}
|
||||
return priv, nil
|
||||
case "DSA":
|
||||
var k DSAKeyFormat
|
||||
_, err := asn1.Unmarshal(block.Bytes, &k)
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("parsing DSA private key from PEM: %s", err)
|
||||
}
|
||||
priv := &dsa.PrivateKey{
|
||||
PublicKey: dsa.PublicKey{
|
||||
Parameters: dsa.Parameters{
|
||||
P: k.P, Q: k.Q, G: k.G,
|
||||
},
|
||||
Y: k.Y,
|
||||
},
|
||||
X: k.X,
|
||||
}
|
||||
return priv, nil
|
||||
default:
|
||||
return nil, fmt.Errorf("invalid private key type %s", block.Type)
|
||||
}
|
||||
}
|
||||
|
||||
func getPublicKey(priv crypto.PrivateKey) (crypto.PublicKey, error) {
|
||||
switch k := priv.(type) {
|
||||
case interface{ Public() crypto.PublicKey }:
|
||||
return k.Public(), nil
|
||||
case *dsa.PrivateKey:
|
||||
return &k.PublicKey, nil
|
||||
default:
|
||||
return nil, fmt.Errorf("unable to get public key for type %T", priv)
|
||||
}
|
||||
}
|
||||
|
||||
type certificate struct {
|
||||
Cert string
|
||||
Key string
|
||||
}
|
||||
|
||||
func buildCustomCertificate(b64cert string, b64key string) (certificate, error) {
|
||||
crt := certificate{}
|
||||
|
||||
cert, err := base64.StdEncoding.DecodeString(b64cert)
|
||||
if err != nil {
|
||||
return crt, errors.New("unable to decode base64 certificate")
|
||||
}
|
||||
|
||||
key, err := base64.StdEncoding.DecodeString(b64key)
|
||||
if err != nil {
|
||||
return crt, errors.New("unable to decode base64 private key")
|
||||
}
|
||||
|
||||
decodedCert, _ := pem.Decode(cert)
|
||||
if decodedCert == nil {
|
||||
return crt, errors.New("unable to decode certificate")
|
||||
}
|
||||
_, err = x509.ParseCertificate(decodedCert.Bytes)
|
||||
if err != nil {
|
||||
return crt, fmt.Errorf(
|
||||
"error parsing certificate: decodedCert.Bytes: %s",
|
||||
err,
|
||||
)
|
||||
}
|
||||
|
||||
_, err = parsePrivateKeyPEM(string(key))
|
||||
if err != nil {
|
||||
return crt, fmt.Errorf(
|
||||
"error parsing private key: %s",
|
||||
err,
|
||||
)
|
||||
}
|
||||
|
||||
crt.Cert = string(cert)
|
||||
crt.Key = string(key)
|
||||
|
||||
return crt, nil
|
||||
}
|
||||
|
||||
func generateCertificateAuthority(
|
||||
cn string,
|
||||
daysValid int,
|
||||
) (certificate, error) {
|
||||
priv, err := rsa.GenerateKey(rand.Reader, 2048)
|
||||
if err != nil {
|
||||
return certificate{}, fmt.Errorf("error generating rsa key: %s", err)
|
||||
}
|
||||
|
||||
return generateCertificateAuthorityWithKeyInternal(cn, daysValid, priv)
|
||||
}
|
||||
|
||||
func generateCertificateAuthorityWithPEMKey(
|
||||
cn string,
|
||||
daysValid int,
|
||||
privPEM string,
|
||||
) (certificate, error) {
|
||||
priv, err := parsePrivateKeyPEM(privPEM)
|
||||
if err != nil {
|
||||
return certificate{}, fmt.Errorf("parsing private key: %s", err)
|
||||
}
|
||||
return generateCertificateAuthorityWithKeyInternal(cn, daysValid, priv)
|
||||
}
|
||||
|
||||
func generateCertificateAuthorityWithKeyInternal(
|
||||
cn string,
|
||||
daysValid int,
|
||||
priv crypto.PrivateKey,
|
||||
) (certificate, error) {
|
||||
ca := certificate{}
|
||||
|
||||
template, err := getBaseCertTemplate(cn, nil, nil, daysValid)
|
||||
if err != nil {
|
||||
return ca, err
|
||||
}
|
||||
// Override KeyUsage and IsCA
|
||||
template.KeyUsage = x509.KeyUsageKeyEncipherment |
|
||||
x509.KeyUsageDigitalSignature |
|
||||
x509.KeyUsageCertSign
|
||||
template.IsCA = true
|
||||
|
||||
ca.Cert, ca.Key, err = getCertAndKey(template, priv, template, priv)
|
||||
|
||||
return ca, err
|
||||
}
|
||||
|
||||
func generateSelfSignedCertificate(
|
||||
cn string,
|
||||
ips []interface{},
|
||||
alternateDNS []interface{},
|
||||
daysValid int,
|
||||
) (certificate, error) {
|
||||
priv, err := rsa.GenerateKey(rand.Reader, 2048)
|
||||
if err != nil {
|
||||
return certificate{}, fmt.Errorf("error generating rsa key: %s", err)
|
||||
}
|
||||
return generateSelfSignedCertificateWithKeyInternal(cn, ips, alternateDNS, daysValid, priv)
|
||||
}
|
||||
|
||||
func generateSelfSignedCertificateWithPEMKey(
|
||||
cn string,
|
||||
ips []interface{},
|
||||
alternateDNS []interface{},
|
||||
daysValid int,
|
||||
privPEM string,
|
||||
) (certificate, error) {
|
||||
priv, err := parsePrivateKeyPEM(privPEM)
|
||||
if err != nil {
|
||||
return certificate{}, fmt.Errorf("parsing private key: %s", err)
|
||||
}
|
||||
return generateSelfSignedCertificateWithKeyInternal(cn, ips, alternateDNS, daysValid, priv)
|
||||
}
|
||||
|
||||
func generateSelfSignedCertificateWithKeyInternal(
|
||||
cn string,
|
||||
ips []interface{},
|
||||
alternateDNS []interface{},
|
||||
daysValid int,
|
||||
priv crypto.PrivateKey,
|
||||
) (certificate, error) {
|
||||
cert := certificate{}
|
||||
|
||||
template, err := getBaseCertTemplate(cn, ips, alternateDNS, daysValid)
|
||||
if err != nil {
|
||||
return cert, err
|
||||
}
|
||||
|
||||
cert.Cert, cert.Key, err = getCertAndKey(template, priv, template, priv)
|
||||
|
||||
return cert, err
|
||||
}
|
||||
|
||||
func generateSignedCertificate(
|
||||
cn string,
|
||||
ips []interface{},
|
||||
alternateDNS []interface{},
|
||||
daysValid int,
|
||||
ca certificate,
|
||||
) (certificate, error) {
|
||||
priv, err := rsa.GenerateKey(rand.Reader, 2048)
|
||||
if err != nil {
|
||||
return certificate{}, fmt.Errorf("error generating rsa key: %s", err)
|
||||
}
|
||||
return generateSignedCertificateWithKeyInternal(cn, ips, alternateDNS, daysValid, ca, priv)
|
||||
}
|
||||
|
||||
func generateSignedCertificateWithPEMKey(
|
||||
cn string,
|
||||
ips []interface{},
|
||||
alternateDNS []interface{},
|
||||
daysValid int,
|
||||
ca certificate,
|
||||
privPEM string,
|
||||
) (certificate, error) {
|
||||
priv, err := parsePrivateKeyPEM(privPEM)
|
||||
if err != nil {
|
||||
return certificate{}, fmt.Errorf("parsing private key: %s", err)
|
||||
}
|
||||
return generateSignedCertificateWithKeyInternal(cn, ips, alternateDNS, daysValid, ca, priv)
|
||||
}
|
||||
|
||||
func generateSignedCertificateWithKeyInternal(
|
||||
cn string,
|
||||
ips []interface{},
|
||||
alternateDNS []interface{},
|
||||
daysValid int,
|
||||
ca certificate,
|
||||
priv crypto.PrivateKey,
|
||||
) (certificate, error) {
|
||||
cert := certificate{}
|
||||
|
||||
decodedSignerCert, _ := pem.Decode([]byte(ca.Cert))
|
||||
if decodedSignerCert == nil {
|
||||
return cert, errors.New("unable to decode certificate")
|
||||
}
|
||||
signerCert, err := x509.ParseCertificate(decodedSignerCert.Bytes)
|
||||
if err != nil {
|
||||
return cert, fmt.Errorf(
|
||||
"error parsing certificate: decodedSignerCert.Bytes: %s",
|
||||
err,
|
||||
)
|
||||
}
|
||||
signerKey, err := parsePrivateKeyPEM(ca.Key)
|
||||
if err != nil {
|
||||
return cert, fmt.Errorf(
|
||||
"error parsing private key: %s",
|
||||
err,
|
||||
)
|
||||
}
|
||||
|
||||
template, err := getBaseCertTemplate(cn, ips, alternateDNS, daysValid)
|
||||
if err != nil {
|
||||
return cert, err
|
||||
}
|
||||
|
||||
cert.Cert, cert.Key, err = getCertAndKey(
|
||||
template,
|
||||
priv,
|
||||
signerCert,
|
||||
signerKey,
|
||||
)
|
||||
|
||||
return cert, err
|
||||
}
|
||||
|
||||
func getCertAndKey(
|
||||
template *x509.Certificate,
|
||||
signeeKey crypto.PrivateKey,
|
||||
parent *x509.Certificate,
|
||||
signingKey crypto.PrivateKey,
|
||||
) (string, string, error) {
|
||||
signeePubKey, err := getPublicKey(signeeKey)
|
||||
if err != nil {
|
||||
return "", "", fmt.Errorf("error retrieving public key from signee key: %s", err)
|
||||
}
|
||||
derBytes, err := x509.CreateCertificate(
|
||||
rand.Reader,
|
||||
template,
|
||||
parent,
|
||||
signeePubKey,
|
||||
signingKey,
|
||||
)
|
||||
if err != nil {
|
||||
return "", "", fmt.Errorf("error creating certificate: %s", err)
|
||||
}
|
||||
|
||||
certBuffer := bytes.Buffer{}
|
||||
if err := pem.Encode(
|
||||
&certBuffer,
|
||||
&pem.Block{Type: "CERTIFICATE", Bytes: derBytes},
|
||||
); err != nil {
|
||||
return "", "", fmt.Errorf("error pem-encoding certificate: %s", err)
|
||||
}
|
||||
|
||||
keyBuffer := bytes.Buffer{}
|
||||
if err := pem.Encode(
|
||||
&keyBuffer,
|
||||
pemBlockForKey(signeeKey),
|
||||
); err != nil {
|
||||
return "", "", fmt.Errorf("error pem-encoding key: %s", err)
|
||||
}
|
||||
|
||||
return certBuffer.String(), keyBuffer.String(), nil
|
||||
}
|
||||
|
||||
func getBaseCertTemplate(
|
||||
cn string,
|
||||
ips []interface{},
|
||||
alternateDNS []interface{},
|
||||
daysValid int,
|
||||
) (*x509.Certificate, error) {
|
||||
ipAddresses, err := getNetIPs(ips)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
dnsNames, err := getAlternateDNSStrs(alternateDNS)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
serialNumberUpperBound := new(big.Int).Lsh(big.NewInt(1), 128)
|
||||
serialNumber, err := rand.Int(rand.Reader, serialNumberUpperBound)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
return &x509.Certificate{
|
||||
SerialNumber: serialNumber,
|
||||
Subject: pkix.Name{
|
||||
CommonName: cn,
|
||||
},
|
||||
IPAddresses: ipAddresses,
|
||||
DNSNames: dnsNames,
|
||||
NotBefore: time.Now(),
|
||||
NotAfter: time.Now().Add(time.Hour * 24 * time.Duration(daysValid)),
|
||||
KeyUsage: x509.KeyUsageKeyEncipherment | x509.KeyUsageDigitalSignature,
|
||||
ExtKeyUsage: []x509.ExtKeyUsage{
|
||||
x509.ExtKeyUsageServerAuth,
|
||||
x509.ExtKeyUsageClientAuth,
|
||||
},
|
||||
BasicConstraintsValid: true,
|
||||
}, nil
|
||||
}
|
||||
|
||||
func getNetIPs(ips []interface{}) ([]net.IP, error) {
|
||||
if ips == nil {
|
||||
return []net.IP{}, nil
|
||||
}
|
||||
var ipStr string
|
||||
var ok bool
|
||||
var netIP net.IP
|
||||
netIPs := make([]net.IP, len(ips))
|
||||
for i, ip := range ips {
|
||||
ipStr, ok = ip.(string)
|
||||
if !ok {
|
||||
return nil, fmt.Errorf("error parsing ip: %v is not a string", ip)
|
||||
}
|
||||
netIP = net.ParseIP(ipStr)
|
||||
if netIP == nil {
|
||||
return nil, fmt.Errorf("error parsing ip: %s", ipStr)
|
||||
}
|
||||
netIPs[i] = netIP
|
||||
}
|
||||
return netIPs, nil
|
||||
}
|
||||
|
||||
func getAlternateDNSStrs(alternateDNS []interface{}) ([]string, error) {
|
||||
if alternateDNS == nil {
|
||||
return []string{}, nil
|
||||
}
|
||||
var dnsStr string
|
||||
var ok bool
|
||||
alternateDNSStrs := make([]string, len(alternateDNS))
|
||||
for i, dns := range alternateDNS {
|
||||
dnsStr, ok = dns.(string)
|
||||
if !ok {
|
||||
return nil, fmt.Errorf(
|
||||
"error processing alternate dns name: %v is not a string",
|
||||
dns,
|
||||
)
|
||||
}
|
||||
alternateDNSStrs[i] = dnsStr
|
||||
}
|
||||
return alternateDNSStrs, nil
|
||||
}
|
||||
|
||||
func encryptAES(password string, plaintext string) (string, error) {
|
||||
if plaintext == "" {
|
||||
return "", nil
|
||||
}
|
||||
|
||||
key := make([]byte, 32)
|
||||
copy(key, []byte(password))
|
||||
block, err := aes.NewCipher(key)
|
||||
if err != nil {
|
||||
return "", err
|
||||
}
|
||||
|
||||
content := []byte(plaintext)
|
||||
blockSize := block.BlockSize()
|
||||
padding := blockSize - len(content)%blockSize
|
||||
padtext := bytes.Repeat([]byte{byte(padding)}, padding)
|
||||
content = append(content, padtext...)
|
||||
|
||||
ciphertext := make([]byte, aes.BlockSize+len(content))
|
||||
|
||||
iv := ciphertext[:aes.BlockSize]
|
||||
if _, err := io.ReadFull(rand.Reader, iv); err != nil {
|
||||
return "", err
|
||||
}
|
||||
|
||||
mode := cipher.NewCBCEncrypter(block, iv)
|
||||
mode.CryptBlocks(ciphertext[aes.BlockSize:], content)
|
||||
|
||||
return base64.StdEncoding.EncodeToString(ciphertext), nil
|
||||
}
|
||||
|
||||
func decryptAES(password string, crypt64 string) (string, error) {
|
||||
if crypt64 == "" {
|
||||
return "", nil
|
||||
}
|
||||
|
||||
key := make([]byte, 32)
|
||||
copy(key, []byte(password))
|
||||
|
||||
crypt, err := base64.StdEncoding.DecodeString(crypt64)
|
||||
if err != nil {
|
||||
return "", err
|
||||
}
|
||||
|
||||
block, err := aes.NewCipher(key)
|
||||
if err != nil {
|
||||
return "", err
|
||||
}
|
||||
|
||||
iv := crypt[:aes.BlockSize]
|
||||
crypt = crypt[aes.BlockSize:]
|
||||
decrypted := make([]byte, len(crypt))
|
||||
mode := cipher.NewCBCDecrypter(block, iv)
|
||||
mode.CryptBlocks(decrypted, crypt)
|
||||
|
||||
return string(decrypted[:len(decrypted)-int(decrypted[len(decrypted)-1])]), nil
|
||||
}
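// Sketch of the round trip: the password is copied into a zeroed 32-byte
// buffer (an AES-256 key), the plaintext is padded PKCS#7-style, a random IV
// is prepended to the CBC ciphertext, and the result is base64-encoded.
// decryptAES reverses those steps, so:
//
//   enc, _ := encryptAES("secret", "hello world") // output differs each call (random IV)
//   dec, _ := decryptAES("secret", enc)           // "hello world"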
|
||||
152
vendor/github.com/Masterminds/sprig/v3/date.go
generated
vendored
Normal file
|
|
@@ -0,0 +1,152 @@
|
|||
package sprig
|
||||
|
||||
import (
|
||||
"strconv"
|
||||
"time"
|
||||
)
|
||||
|
||||
// Given a format and a date, format the date string.
|
||||
//
|
||||
// Date can be a `time.Time` or an `int, int32, int64`.
|
||||
// In the latter case, it is treated as seconds since UNIX
|
||||
// epoch.
|
||||
func date(fmt string, date interface{}) string {
|
||||
return dateInZone(fmt, date, "Local")
|
||||
}
|
||||
|
||||
func htmlDate(date interface{}) string {
|
||||
return dateInZone("2006-01-02", date, "Local")
|
||||
}
|
||||
|
||||
func htmlDateInZone(date interface{}, zone string) string {
|
||||
return dateInZone("2006-01-02", date, zone)
|
||||
}
|
||||
|
||||
func dateInZone(fmt string, date interface{}, zone string) string {
|
||||
var t time.Time
|
||||
switch date := date.(type) {
|
||||
default:
|
||||
t = time.Now()
|
||||
case time.Time:
|
||||
t = date
|
||||
case *time.Time:
|
||||
t = *date
|
||||
case int64:
|
||||
t = time.Unix(date, 0)
|
||||
case int:
|
||||
t = time.Unix(int64(date), 0)
|
||||
case int32:
|
||||
t = time.Unix(int64(date), 0)
|
||||
}
|
||||
|
||||
loc, err := time.LoadLocation(zone)
|
||||
if err != nil {
|
||||
loc, _ = time.LoadLocation("UTC")
|
||||
}
|
||||
|
||||
return t.In(loc).Format(fmt)
|
||||
}
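// Sketch: dateInZone accepts time.Time, *time.Time, or int/int32/int64 Unix
// seconds; any other type falls back to time.Now(), and an unknown zone name
// falls back to UTC instead of returning an error.
//
//   dateInZone("2006-01-02", int64(0), "UTC")     // "1970-01-01"
//   dateInZone("2006-01-02", int64(0), "Nowhere") // also "1970-01-01" (UTC fallback)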
|
||||
|
||||
func dateModify(fmt string, date time.Time) time.Time {
|
||||
d, err := time.ParseDuration(fmt)
|
||||
if err != nil {
|
||||
return date
|
||||
}
|
||||
return date.Add(d)
|
||||
}
|
||||
|
||||
func mustDateModify(fmt string, date time.Time) (time.Time, error) {
|
||||
d, err := time.ParseDuration(fmt)
|
||||
if err != nil {
|
||||
return time.Time{}, err
|
||||
}
|
||||
return date.Add(d), nil
|
||||
}
|
||||
|
||||
func dateAgo(date interface{}) string {
|
||||
var t time.Time
|
||||
|
||||
switch date := date.(type) {
|
||||
default:
|
||||
t = time.Now()
|
||||
case time.Time:
|
||||
t = date
|
||||
case int64:
|
||||
t = time.Unix(date, 0)
|
||||
case int:
|
||||
t = time.Unix(int64(date), 0)
|
||||
}
|
||||
// Drop resolution to seconds
|
||||
duration := time.Since(t).Round(time.Second)
|
||||
return duration.String()
|
||||
}
|
||||
|
||||
func duration(sec interface{}) string {
|
||||
var n int64
|
||||
switch value := sec.(type) {
|
||||
default:
|
||||
n = 0
|
||||
case string:
|
||||
n, _ = strconv.ParseInt(value, 10, 64)
|
||||
case int64:
|
||||
n = value
|
||||
}
|
||||
return (time.Duration(n) * time.Second).String()
|
||||
}
|
||||
|
||||
func durationRound(duration interface{}) string {
|
||||
var d time.Duration
|
||||
switch duration := duration.(type) {
|
||||
default:
|
||||
d = 0
|
||||
case string:
|
||||
d, _ = time.ParseDuration(duration)
|
||||
case int64:
|
||||
d = time.Duration(duration)
|
||||
case time.Time:
|
||||
d = time.Since(duration)
|
||||
}
|
||||
|
||||
u := uint64(d)
|
||||
neg := d < 0
|
||||
if neg {
|
||||
u = -u
|
||||
}
|
||||
|
||||
var (
|
||||
year = uint64(time.Hour) * 24 * 365
|
||||
month = uint64(time.Hour) * 24 * 30
|
||||
day = uint64(time.Hour) * 24
|
||||
hour = uint64(time.Hour)
|
||||
minute = uint64(time.Minute)
|
||||
second = uint64(time.Second)
|
||||
)
|
||||
switch {
|
||||
case u > year:
|
||||
return strconv.FormatUint(u/year, 10) + "y"
|
||||
case u > month:
|
||||
return strconv.FormatUint(u/month, 10) + "mo"
|
||||
case u > day:
|
||||
return strconv.FormatUint(u/day, 10) + "d"
|
||||
case u > hour:
|
||||
return strconv.FormatUint(u/hour, 10) + "h"
|
||||
case u > minute:
|
||||
return strconv.FormatUint(u/minute, 10) + "m"
|
||||
case u > second:
|
||||
return strconv.FormatUint(u/second, 10) + "s"
|
||||
}
|
||||
return "0s"
|
||||
}
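// Sketch: durationRound keeps only the most significant unit, using 365-day
// years and 30-day months; it accepts a duration string, an int64 in
// nanoseconds, or a time.Time (rounded as time elapsed since then).
//
//   durationRound("2h10m5s") // "2h"
//   durationRound("45m20s")  // "45m"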
|
||||
|
||||
func toDate(fmt, str string) time.Time {
|
||||
t, _ := time.ParseInLocation(fmt, str, time.Local)
|
||||
return t
|
||||
}
|
||||
|
||||
func mustToDate(fmt, str string) (time.Time, error) {
|
||||
return time.ParseInLocation(fmt, str, time.Local)
|
||||
}
|
||||
|
||||
func unixEpoch(date time.Time) string {
|
||||
return strconv.FormatInt(date.Unix(), 10)
|
||||
}
|
||||
163
vendor/github.com/Masterminds/sprig/v3/defaults.go
generated
vendored
Normal file
|
|
@@ -0,0 +1,163 @@
|
|||
package sprig
|
||||
|
||||
import (
|
||||
"bytes"
|
||||
"encoding/json"
|
||||
"math/rand"
|
||||
"reflect"
|
||||
"strings"
|
||||
"time"
|
||||
)
|
||||
|
||||
func init() {
|
||||
rand.Seed(time.Now().UnixNano())
|
||||
}
|
||||
|
||||
// dfault checks whether `given` is set, and returns default if not set.
|
||||
//
|
||||
// This returns `d` if `given` appears not to be set, and `given` otherwise.
|
||||
//
|
||||
// For numeric types 0 is unset.
|
||||
// For strings, maps, arrays, and slices, len() = 0 is considered unset.
|
||||
// For bool, false is unset.
|
||||
// Structs are never considered unset.
|
||||
//
|
||||
// For everything else, including pointers, a nil value is unset.
|
||||
func dfault(d interface{}, given ...interface{}) interface{} {
|
||||
|
||||
if empty(given) || empty(given[0]) {
|
||||
return d
|
||||
}
|
||||
return given[0]
|
||||
}
|
||||
|
||||
// empty returns true if the given value has the zero value for its type.
|
||||
func empty(given interface{}) bool {
|
||||
g := reflect.ValueOf(given)
|
||||
if !g.IsValid() {
|
||||
return true
|
||||
}
|
||||
|
||||
// Basically adapted from text/template.isTrue
|
||||
switch g.Kind() {
|
||||
default:
|
||||
return g.IsNil()
|
||||
case reflect.Array, reflect.Slice, reflect.Map, reflect.String:
|
||||
return g.Len() == 0
|
||||
case reflect.Bool:
|
||||
return !g.Bool()
|
||||
case reflect.Complex64, reflect.Complex128:
|
||||
return g.Complex() == 0
|
||||
case reflect.Int, reflect.Int8, reflect.Int16, reflect.Int32, reflect.Int64:
|
||||
return g.Int() == 0
|
||||
case reflect.Uint, reflect.Uint8, reflect.Uint16, reflect.Uint32, reflect.Uint64, reflect.Uintptr:
|
||||
return g.Uint() == 0
|
||||
case reflect.Float32, reflect.Float64:
|
||||
return g.Float() == 0
|
||||
case reflect.Struct:
|
||||
return false
|
||||
}
|
||||
}
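// Sketch of how these combine in a template: `default` substitutes a fallback
// whenever the piped value is empty by the rules above.
//
//   {{ "" | default "fallback" }}      // "fallback"
//   {{ "value" | default "fallback" }} // "value"
//   {{ 0 | default 42 }}               // "42"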
|
||||
|
||||
// coalesce returns the first non-empty value.
|
||||
func coalesce(v ...interface{}) interface{} {
|
||||
for _, val := range v {
|
||||
if !empty(val) {
|
||||
return val
|
||||
}
|
||||
}
|
||||
return nil
|
||||
}
|
||||
|
||||
// all returns true if empty(x) is false for all values x in the list.
|
||||
// If the list is empty, return true.
|
||||
func all(v ...interface{}) bool {
|
||||
for _, val := range v {
|
||||
if empty(val) {
|
||||
return false
|
||||
}
|
||||
}
|
||||
return true
|
||||
}
|
||||
|
||||
// any returns true if empty(x) is false for any x in the list.
|
||||
// If the list is empty, return false.
|
||||
func any(v ...interface{}) bool {
|
||||
for _, val := range v {
|
||||
if !empty(val) {
|
||||
return true
|
||||
}
|
||||
}
|
||||
return false
|
||||
}
|
||||
|
||||
// fromJson decodes JSON into a structured value, ignoring errors.
|
||||
func fromJson(v string) interface{} {
|
||||
output, _ := mustFromJson(v)
|
||||
return output
|
||||
}
|
||||
|
||||
// mustFromJson decodes JSON into a structured value, returning errors.
|
||||
func mustFromJson(v string) (interface{}, error) {
|
||||
var output interface{}
|
||||
err := json.Unmarshal([]byte(v), &output)
|
||||
return output, err
|
||||
}
|
||||
|
||||
// toJson encodes an item into a JSON string
|
||||
func toJson(v interface{}) string {
|
||||
output, _ := json.Marshal(v)
|
||||
return string(output)
|
||||
}
|
||||
|
||||
func mustToJson(v interface{}) (string, error) {
|
||||
output, err := json.Marshal(v)
|
||||
if err != nil {
|
||||
return "", err
|
||||
}
|
||||
return string(output), nil
|
||||
}
|
||||
|
||||
// toPrettyJson encodes an item into a pretty (indented) JSON string
|
||||
func toPrettyJson(v interface{}) string {
|
||||
output, _ := json.MarshalIndent(v, "", " ")
|
||||
return string(output)
|
||||
}
|
||||
|
||||
func mustToPrettyJson(v interface{}) (string, error) {
|
||||
output, err := json.MarshalIndent(v, "", " ")
|
||||
if err != nil {
|
||||
return "", err
|
||||
}
|
||||
return string(output), nil
|
||||
}
|
||||
|
||||
// toRawJson encodes an item into a JSON string with no escaping of HTML characters.
|
||||
func toRawJson(v interface{}) string {
|
||||
output, err := mustToRawJson(v)
|
||||
if err != nil {
|
||||
panic(err)
|
||||
}
|
||||
return string(output)
|
||||
}
|
||||
|
||||
// mustToRawJson encodes an item into a JSON string with no escaping of HTML characters.
|
||||
func mustToRawJson(v interface{}) (string, error) {
|
||||
buf := new(bytes.Buffer)
|
||||
enc := json.NewEncoder(buf)
|
||||
enc.SetEscapeHTML(false)
|
||||
err := enc.Encode(&v)
|
||||
if err != nil {
|
||||
return "", err
|
||||
}
|
||||
return strings.TrimSuffix(buf.String(), "\n"), nil
|
||||
}
|
||||
|
||||
// ternary returns the first value if the last value is true, otherwise returns the second value.
|
||||
func ternary(vt interface{}, vf interface{}, v bool) interface{} {
|
||||
if v {
|
||||
return vt
|
||||
}
|
||||
|
||||
return vf
|
||||
}
|
||||
174
vendor/github.com/Masterminds/sprig/v3/dict.go
generated
vendored
Normal file
|
|
@@ -0,0 +1,174 @@
|
|||
package sprig
|
||||
|
||||
import (
|
||||
"github.com/imdario/mergo"
|
||||
"github.com/mitchellh/copystructure"
|
||||
)
|
||||
|
||||
func get(d map[string]interface{}, key string) interface{} {
|
||||
if val, ok := d[key]; ok {
|
||||
return val
|
||||
}
|
||||
return ""
|
||||
}
|
||||
|
||||
func set(d map[string]interface{}, key string, value interface{}) map[string]interface{} {
|
||||
d[key] = value
|
||||
return d
|
||||
}
|
||||
|
||||
func unset(d map[string]interface{}, key string) map[string]interface{} {
|
||||
delete(d, key)
|
||||
return d
|
||||
}
|
||||
|
||||
func hasKey(d map[string]interface{}, key string) bool {
|
||||
_, ok := d[key]
|
||||
return ok
|
||||
}
|
||||
|
||||
func pluck(key string, d ...map[string]interface{}) []interface{} {
|
||||
res := []interface{}{}
|
||||
for _, dict := range d {
|
||||
if val, ok := dict[key]; ok {
|
||||
res = append(res, val)
|
||||
}
|
||||
}
|
||||
return res
|
||||
}
|
||||
|
||||
func keys(dicts ...map[string]interface{}) []string {
|
||||
k := []string{}
|
||||
for _, dict := range dicts {
|
||||
for key := range dict {
|
||||
k = append(k, key)
|
||||
}
|
||||
}
|
||||
return k
|
||||
}
|
||||
|
||||
func pick(dict map[string]interface{}, keys ...string) map[string]interface{} {
|
||||
res := map[string]interface{}{}
|
||||
for _, k := range keys {
|
||||
if v, ok := dict[k]; ok {
|
||||
res[k] = v
|
||||
}
|
||||
}
|
||||
return res
|
||||
}
|
||||
|
||||
func omit(dict map[string]interface{}, keys ...string) map[string]interface{} {
|
||||
res := map[string]interface{}{}
|
||||
|
||||
omit := make(map[string]bool, len(keys))
|
||||
for _, k := range keys {
|
||||
omit[k] = true
|
||||
}
|
||||
|
||||
for k, v := range dict {
|
||||
if _, ok := omit[k]; !ok {
|
||||
res[k] = v
|
||||
}
|
||||
}
|
||||
return res
|
||||
}
|
||||
|
||||
func dict(v ...interface{}) map[string]interface{} {
|
||||
dict := map[string]interface{}{}
|
||||
lenv := len(v)
|
||||
for i := 0; i < lenv; i += 2 {
|
||||
key := strval(v[i])
|
||||
if i+1 >= lenv {
|
||||
dict[key] = ""
|
||||
continue
|
||||
}
|
||||
dict[key] = v[i+1]
|
||||
}
|
||||
return dict
|
||||
}
|
||||
|
||||
func merge(dst map[string]interface{}, srcs ...map[string]interface{}) interface{} {
|
||||
for _, src := range srcs {
|
||||
if err := mergo.Merge(&dst, src); err != nil {
|
||||
// Swallow errors inside of a template.
|
||||
return ""
|
||||
}
|
||||
}
|
||||
return dst
|
||||
}
|
||||
|
||||
func mustMerge(dst map[string]interface{}, srcs ...map[string]interface{}) (interface{}, error) {
|
||||
for _, src := range srcs {
|
||||
if err := mergo.Merge(&dst, src); err != nil {
|
||||
return nil, err
|
||||
}
|
||||
}
|
||||
return dst, nil
|
||||
}
|
||||
|
||||
func mergeOverwrite(dst map[string]interface{}, srcs ...map[string]interface{}) interface{} {
|
||||
for _, src := range srcs {
|
||||
if err := mergo.MergeWithOverwrite(&dst, src); err != nil {
|
||||
// Swallow errors inside of a template.
|
||||
return ""
|
||||
}
|
||||
}
|
||||
return dst
|
||||
}
|
||||
|
||||
func mustMergeOverwrite(dst map[string]interface{}, srcs ...map[string]interface{}) (interface{}, error) {
|
||||
for _, src := range srcs {
|
||||
if err := mergo.MergeWithOverwrite(&dst, src); err != nil {
|
||||
return nil, err
|
||||
}
|
||||
}
|
||||
return dst, nil
|
||||
}
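// Sketch: merge keeps values that are already set in dst, while mergeOverwrite
// lets later sources win; both mutate dst and return it.
//
//   dst := map[string]interface{}{"a": 1}
//   src := map[string]interface{}{"a": 2, "b": 3}
//   merge(dst, src)          // dst is {"a": 1, "b": 3} (existing "a" kept)
//   mergeOverwrite(dst, src) // dst is {"a": 2, "b": 3} ("a" overwritten)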
|
||||
|
||||
func values(dict map[string]interface{}) []interface{} {
|
||||
values := []interface{}{}
|
||||
for _, value := range dict {
|
||||
values = append(values, value)
|
||||
}
|
||||
|
||||
return values
|
||||
}
|
||||
|
||||
func deepCopy(i interface{}) interface{} {
|
||||
c, err := mustDeepCopy(i)
|
||||
if err != nil {
|
||||
panic("deepCopy error: " + err.Error())
|
||||
}
|
||||
|
||||
return c
|
||||
}
|
||||
|
||||
func mustDeepCopy(i interface{}) (interface{}, error) {
|
||||
return copystructure.Copy(i)
|
||||
}
|
||||
|
||||
func dig(ps ...interface{}) (interface{}, error) {
|
||||
if len(ps) < 3 {
|
||||
panic("dig needs at least three arguments")
|
||||
}
|
||||
dict := ps[len(ps)-1].(map[string]interface{})
|
||||
def := ps[len(ps)-2]
|
||||
ks := make([]string, len(ps)-2)
|
||||
for i := 0; i < len(ks); i++ {
|
||||
ks[i] = ps[i].(string)
|
||||
}
|
||||
|
||||
return digFromDict(dict, def, ks)
|
||||
}
|
||||
|
||||
func digFromDict(dict map[string]interface{}, d interface{}, ks []string) (interface{}, error) {
|
||||
k, ns := ks[0], ks[1:len(ks)]
|
||||
step, has := dict[k]
|
||||
if !has {
|
||||
return d, nil
|
||||
}
|
||||
if len(ns) == 0 {
|
||||
return step, nil
|
||||
}
|
||||
return digFromDict(step.(map[string]interface{}), d, ns)
|
||||
}
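// Sketch: dig walks nested maps; the dict is the last argument and the
// default the second-to-last, which suits template pipelines ($m below is
// just an illustrative variable).
//
//   m := map[string]interface{}{"user": map[string]interface{}{"name": "ada"}}
//   dig("user", "name", "n/a", m)  // "ada", nil
//   dig("user", "email", "n/a", m) // "n/a", nil
//   // in a template: {{ dig "user" "name" "n/a" $m }}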
|
||||
19
vendor/github.com/Masterminds/sprig/v3/doc.go
generated
vendored
Normal file
|
|
@@ -0,0 +1,19 @@
|
|||
/*
|
||||
Package sprig provides template functions for Go.
|
||||
|
||||
This package contains a number of utility functions for working with data
|
||||
inside of Go `html/template` and `text/template` files.
|
||||
|
||||
To add these functions, use the `template.Funcs()` method:
|
||||
|
||||
t := template.New("foo").Funcs(sprig.FuncMap())
|
||||
|
||||
Note that you should add the function map before you parse any template files.
|
||||
|
||||
In several cases, Sprig reverses the order of arguments from the way they
|
||||
appear in the standard library. This is to make it easier to pipe
|
||||
arguments into functions.
|
||||
|
||||
See http://masterminds.github.io/sprig/ for more detailed documentation on each of the available functions.
|
||||
*/
|
||||
package sprig
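// A minimal, self-contained usage sketch of the pattern described above
// (the template text and names are only illustrative):
//
//   package main
//
//   import (
//       "os"
//       "text/template"
//
//       "github.com/Masterminds/sprig/v3"
//   )
//
//   func main() {
//       tpl := template.Must(
//           template.New("demo").Funcs(sprig.TxtFuncMap()).Parse(`{{ "hello" | upper | repeat 2 }}`),
//       )
//       _ = tpl.Execute(os.Stdout, nil) // prints "HELLOHELLO"
//   }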
|
||||
382
vendor/github.com/Masterminds/sprig/v3/functions.go
generated
vendored
Normal file
|
|
@@ -0,0 +1,382 @@
|
|||
package sprig
|
||||
|
||||
import (
|
||||
"errors"
|
||||
"html/template"
|
||||
"math/rand"
|
||||
"os"
|
||||
"path"
|
||||
"path/filepath"
|
||||
"reflect"
|
||||
"strconv"
|
||||
"strings"
|
||||
ttemplate "text/template"
|
||||
"time"
|
||||
|
||||
util "github.com/Masterminds/goutils"
|
||||
"github.com/huandu/xstrings"
|
||||
"github.com/shopspring/decimal"
|
||||
)
|
||||
|
||||
// FuncMap produces the function map.
|
||||
//
|
||||
// Use this to pass the functions into the template engine:
|
||||
//
|
||||
// tpl := template.New("foo").Funcs(sprig.FuncMap())
|
||||
//
|
||||
func FuncMap() template.FuncMap {
|
||||
return HtmlFuncMap()
|
||||
}
|
||||
|
||||
// HermeticTxtFuncMap returns a 'text/template'.FuncMap with only repeatable functions.
|
||||
func HermeticTxtFuncMap() ttemplate.FuncMap {
|
||||
r := TxtFuncMap()
|
||||
for _, name := range nonhermeticFunctions {
|
||||
delete(r, name)
|
||||
}
|
||||
return r
|
||||
}
|
||||
|
||||
// HermeticHtmlFuncMap returns an 'html/template'.FuncMap with only repeatable functions.
|
||||
func HermeticHtmlFuncMap() template.FuncMap {
|
||||
r := HtmlFuncMap()
|
||||
for _, name := range nonhermeticFunctions {
|
||||
delete(r, name)
|
||||
}
|
||||
return r
|
||||
}
|
||||
|
||||
// TxtFuncMap returns a 'text/template'.FuncMap
|
||||
func TxtFuncMap() ttemplate.FuncMap {
|
||||
return ttemplate.FuncMap(GenericFuncMap())
|
||||
}
|
||||
|
||||
// HtmlFuncMap returns an 'html/template'.FuncMap
|
||||
func HtmlFuncMap() template.FuncMap {
|
||||
return template.FuncMap(GenericFuncMap())
|
||||
}
|
||||
|
||||
// GenericFuncMap returns a copy of the basic function map as a map[string]interface{}.
|
||||
func GenericFuncMap() map[string]interface{} {
|
||||
gfm := make(map[string]interface{}, len(genericMap))
|
||||
for k, v := range genericMap {
|
||||
gfm[k] = v
|
||||
}
|
||||
return gfm
|
||||
}
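// Sketch: GenericFuncMap hands back a copy, so callers can prune or extend it
// before wiring it into a template ("shout" below is a made-up example name):
//
//   fm := GenericFuncMap()
//   delete(fm, "env")             // drop a non-hermetic function
//   fm["shout"] = strings.ToUpper // add a custom helper
//   tpl := ttemplate.New("t").Funcs(fm)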
|
||||
|
||||
// These functions are not guaranteed to evaluate to the same result for given input, because they
|
||||
// refer to the environment or global state.
|
||||
var nonhermeticFunctions = []string{
|
||||
// Date functions
|
||||
"date",
|
||||
"date_in_zone",
|
||||
"date_modify",
|
||||
"now",
|
||||
"htmlDate",
|
||||
"htmlDateInZone",
|
||||
"dateInZone",
|
||||
"dateModify",
|
||||
|
||||
// Strings
|
||||
"randAlphaNum",
|
||||
"randAlpha",
|
||||
"randAscii",
|
||||
"randNumeric",
|
||||
"randBytes",
|
||||
"uuidv4",
|
||||
|
||||
// OS
|
||||
"env",
|
||||
"expandenv",
|
||||
|
||||
// Network
|
||||
"getHostByName",
|
||||
}
|
||||
|
||||
var genericMap = map[string]interface{}{
|
||||
"hello": func() string { return "Hello!" },
|
||||
|
||||
// Date functions
|
||||
"ago": dateAgo,
|
||||
"date": date,
|
||||
"date_in_zone": dateInZone,
|
||||
"date_modify": dateModify,
|
||||
"dateInZone": dateInZone,
|
||||
"dateModify": dateModify,
|
||||
"duration": duration,
|
||||
"durationRound": durationRound,
|
||||
"htmlDate": htmlDate,
|
||||
"htmlDateInZone": htmlDateInZone,
|
||||
"must_date_modify": mustDateModify,
|
||||
"mustDateModify": mustDateModify,
|
||||
"mustToDate": mustToDate,
|
||||
"now": time.Now,
|
||||
"toDate": toDate,
|
||||
"unixEpoch": unixEpoch,
|
||||
|
||||
// Strings
|
||||
"abbrev": abbrev,
|
||||
"abbrevboth": abbrevboth,
|
||||
"trunc": trunc,
|
||||
"trim": strings.TrimSpace,
|
||||
"upper": strings.ToUpper,
|
||||
"lower": strings.ToLower,
|
||||
"title": strings.Title,
|
||||
"untitle": untitle,
|
||||
"substr": substring,
|
||||
// Switch order so that "foo" | repeat 5
|
||||
"repeat": func(count int, str string) string { return strings.Repeat(str, count) },
|
||||
// Deprecated: Use trimAll.
|
||||
"trimall": func(a, b string) string { return strings.Trim(b, a) },
|
||||
// Switch order so that "$foo" | trimall "$"
|
||||
"trimAll": func(a, b string) string { return strings.Trim(b, a) },
|
||||
"trimSuffix": func(a, b string) string { return strings.TrimSuffix(b, a) },
|
||||
"trimPrefix": func(a, b string) string { return strings.TrimPrefix(b, a) },
|
||||
"nospace": util.DeleteWhiteSpace,
|
||||
"initials": initials,
|
||||
"randAlphaNum": randAlphaNumeric,
|
||||
"randAlpha": randAlpha,
|
||||
"randAscii": randAscii,
|
||||
"randNumeric": randNumeric,
|
||||
"swapcase": util.SwapCase,
|
||||
"shuffle": xstrings.Shuffle,
|
||||
"snakecase": xstrings.ToSnakeCase,
|
||||
"camelcase": xstrings.ToCamelCase,
|
||||
"kebabcase": xstrings.ToKebabCase,
|
||||
"wrap": func(l int, s string) string { return util.Wrap(s, l) },
|
||||
"wrapWith": func(l int, sep, str string) string { return util.WrapCustom(str, l, sep, true) },
|
||||
// Switch order so that "foobar" | contains "foo"
|
||||
"contains": func(substr string, str string) bool { return strings.Contains(str, substr) },
|
||||
"hasPrefix": func(substr string, str string) bool { return strings.HasPrefix(str, substr) },
|
||||
"hasSuffix": func(substr string, str string) bool { return strings.HasSuffix(str, substr) },
|
||||
"quote": quote,
|
||||
"squote": squote,
|
||||
"cat": cat,
|
||||
"indent": indent,
|
||||
"nindent": nindent,
|
||||
"replace": replace,
|
||||
"plural": plural,
|
||||
"sha1sum": sha1sum,
|
||||
"sha256sum": sha256sum,
|
||||
"adler32sum": adler32sum,
|
||||
"toString": strval,
|
||||
|
||||
// Wrap Atoi to stop errors.
|
||||
"atoi": func(a string) int { i, _ := strconv.Atoi(a); return i },
|
||||
"int64": toInt64,
|
||||
"int": toInt,
|
||||
"float64": toFloat64,
|
||||
"seq": seq,
|
||||
"toDecimal": toDecimal,
|
||||
|
||||
//"gt": func(a, b int) bool {return a > b},
|
||||
//"gte": func(a, b int) bool {return a >= b},
|
||||
//"lt": func(a, b int) bool {return a < b},
|
||||
//"lte": func(a, b int) bool {return a <= b},
|
||||
|
||||
// split "/" foo/bar returns map[int]string{0: foo, 1: bar}
|
||||
"split": split,
|
||||
"splitList": func(sep, orig string) []string { return strings.Split(orig, sep) },
|
||||
// splitn "/" foo/bar/fuu returns map[int]string{0: foo, 1: bar/fuu}
|
||||
"splitn": splitn,
|
||||
"toStrings": strslice,
|
||||
|
||||
"until": until,
|
||||
"untilStep": untilStep,
|
||||
|
||||
// VERY basic arithmetic.
|
||||
"add1": func(i interface{}) int64 { return toInt64(i) + 1 },
|
||||
"add": func(i ...interface{}) int64 {
|
||||
var a int64 = 0
|
||||
for _, b := range i {
|
||||
a += toInt64(b)
|
||||
}
|
||||
return a
|
||||
},
|
||||
"sub": func(a, b interface{}) int64 { return toInt64(a) - toInt64(b) },
|
||||
"div": func(a, b interface{}) int64 { return toInt64(a) / toInt64(b) },
|
||||
"mod": func(a, b interface{}) int64 { return toInt64(a) % toInt64(b) },
|
||||
"mul": func(a interface{}, v ...interface{}) int64 {
|
||||
val := toInt64(a)
|
||||
for _, b := range v {
|
||||
val = val * toInt64(b)
|
||||
}
|
||||
return val
|
||||
},
|
||||
"randInt": func(min, max int) int { return rand.Intn(max-min) + min },
|
||||
"add1f": func(i interface{}) float64 {
|
||||
return execDecimalOp(i, []interface{}{1}, func(d1, d2 decimal.Decimal) decimal.Decimal { return d1.Add(d2) })
|
||||
},
|
||||
"addf": func(i ...interface{}) float64 {
|
||||
a := interface{}(float64(0))
|
||||
return execDecimalOp(a, i, func(d1, d2 decimal.Decimal) decimal.Decimal { return d1.Add(d2) })
|
||||
},
|
||||
"subf": func(a interface{}, v ...interface{}) float64 {
|
||||
return execDecimalOp(a, v, func(d1, d2 decimal.Decimal) decimal.Decimal { return d1.Sub(d2) })
|
||||
},
|
||||
"divf": func(a interface{}, v ...interface{}) float64 {
|
||||
return execDecimalOp(a, v, func(d1, d2 decimal.Decimal) decimal.Decimal { return d1.Div(d2) })
|
||||
},
|
||||
"mulf": func(a interface{}, v ...interface{}) float64 {
|
||||
return execDecimalOp(a, v, func(d1, d2 decimal.Decimal) decimal.Decimal { return d1.Mul(d2) })
|
||||
},
|
||||
"biggest": max,
|
||||
"max": max,
|
||||
"min": min,
|
||||
"maxf": maxf,
|
||||
"minf": minf,
|
||||
"ceil": ceil,
|
||||
"floor": floor,
|
||||
"round": round,
|
||||
|
||||
// string slices. Note that we reverse the order b/c that's better
|
||||
// for template processing.
|
||||
"join": join,
|
||||
"sortAlpha": sortAlpha,
|
||||
|
||||
// Defaults
|
||||
"default": dfault,
|
||||
"empty": empty,
|
||||
"coalesce": coalesce,
|
||||
"all": all,
|
||||
"any": any,
|
||||
"compact": compact,
|
||||
"mustCompact": mustCompact,
|
||||
"fromJson": fromJson,
|
||||
"toJson": toJson,
|
||||
"toPrettyJson": toPrettyJson,
|
||||
"toRawJson": toRawJson,
|
||||
"mustFromJson": mustFromJson,
|
||||
"mustToJson": mustToJson,
|
||||
"mustToPrettyJson": mustToPrettyJson,
|
||||
"mustToRawJson": mustToRawJson,
|
||||
"ternary": ternary,
|
||||
"deepCopy": deepCopy,
|
||||
"mustDeepCopy": mustDeepCopy,
|
||||
|
||||
// Reflection
|
||||
"typeOf": typeOf,
|
||||
"typeIs": typeIs,
|
||||
"typeIsLike": typeIsLike,
|
||||
"kindOf": kindOf,
|
||||
"kindIs": kindIs,
|
||||
"deepEqual": reflect.DeepEqual,
|
||||
|
||||
// OS:
|
||||
"env": os.Getenv,
|
||||
"expandenv": os.ExpandEnv,
|
||||
|
||||
// Network:
|
||||
"getHostByName": getHostByName,
|
||||
|
||||
// Paths:
|
||||
"base": path.Base,
|
||||
"dir": path.Dir,
|
||||
"clean": path.Clean,
|
||||
"ext": path.Ext,
|
||||
"isAbs": path.IsAbs,
|
||||
|
||||
// Filepaths:
|
||||
"osBase": filepath.Base,
|
||||
"osClean": filepath.Clean,
|
||||
"osDir": filepath.Dir,
|
||||
"osExt": filepath.Ext,
|
||||
"osIsAbs": filepath.IsAbs,
|
||||
|
||||
// Encoding:
|
||||
"b64enc": base64encode,
|
||||
"b64dec": base64decode,
|
||||
"b32enc": base32encode,
|
||||
"b32dec": base32decode,
|
||||
|
||||
// Data Structures:
|
||||
"tuple": list, // FIXME: with the addition of append/prepend these are no longer immutable.
|
||||
"list": list,
|
||||
"dict": dict,
|
||||
"get": get,
|
||||
"set": set,
|
||||
"unset": unset,
|
||||
"hasKey": hasKey,
|
||||
"pluck": pluck,
|
||||
"keys": keys,
|
||||
"pick": pick,
|
||||
"omit": omit,
|
||||
"merge": merge,
|
||||
"mergeOverwrite": mergeOverwrite,
|
||||
"mustMerge": mustMerge,
|
||||
"mustMergeOverwrite": mustMergeOverwrite,
|
||||
"values": values,
|
||||
|
||||
"append": push, "push": push,
|
||||
"mustAppend": mustPush, "mustPush": mustPush,
|
||||
"prepend": prepend,
|
||||
"mustPrepend": mustPrepend,
|
||||
"first": first,
|
||||
"mustFirst": mustFirst,
|
||||
"rest": rest,
|
||||
"mustRest": mustRest,
|
||||
"last": last,
|
||||
"mustLast": mustLast,
|
||||
"initial": initial,
|
||||
"mustInitial": mustInitial,
|
||||
"reverse": reverse,
|
||||
"mustReverse": mustReverse,
|
||||
"uniq": uniq,
|
||||
"mustUniq": mustUniq,
|
||||
"without": without,
|
||||
"mustWithout": mustWithout,
|
||||
"has": has,
|
||||
"mustHas": mustHas,
|
||||
"slice": slice,
|
||||
"mustSlice": mustSlice,
|
||||
"concat": concat,
|
||||
"dig": dig,
|
||||
"chunk": chunk,
|
||||
"mustChunk": mustChunk,
|
||||
|
||||
// Crypto:
|
||||
"bcrypt": bcrypt,
|
||||
"htpasswd": htpasswd,
|
||||
"genPrivateKey": generatePrivateKey,
|
||||
"derivePassword": derivePassword,
|
||||
"buildCustomCert": buildCustomCertificate,
|
||||
"genCA": generateCertificateAuthority,
|
||||
"genCAWithKey": generateCertificateAuthorityWithPEMKey,
|
||||
"genSelfSignedCert": generateSelfSignedCertificate,
|
||||
"genSelfSignedCertWithKey": generateSelfSignedCertificateWithPEMKey,
|
||||
"genSignedCert": generateSignedCertificate,
|
||||
"genSignedCertWithKey": generateSignedCertificateWithPEMKey,
|
||||
"encryptAES": encryptAES,
|
||||
"decryptAES": decryptAES,
|
||||
"randBytes": randBytes,
|
||||
|
||||
// UUIDs:
|
||||
"uuidv4": uuidv4,
|
||||
|
||||
// SemVer:
|
||||
"semver": semver,
|
||||
"semverCompare": semverCompare,
|
||||
|
||||
// Flow Control:
|
||||
"fail": func(msg string) (string, error) { return "", errors.New(msg) },
|
||||
|
||||
// Regex
|
||||
"regexMatch": regexMatch,
|
||||
"mustRegexMatch": mustRegexMatch,
|
||||
"regexFindAll": regexFindAll,
|
||||
"mustRegexFindAll": mustRegexFindAll,
|
||||
"regexFind": regexFind,
|
||||
"mustRegexFind": mustRegexFind,
|
||||
"regexReplaceAll": regexReplaceAll,
|
||||
"mustRegexReplaceAll": mustRegexReplaceAll,
|
||||
"regexReplaceAllLiteral": regexReplaceAllLiteral,
|
||||
"mustRegexReplaceAllLiteral": mustRegexReplaceAllLiteral,
|
||||
"regexSplit": regexSplit,
|
||||
"mustRegexSplit": mustRegexSplit,
|
||||
"regexQuoteMeta": regexQuoteMeta,
|
||||
|
||||
// URLs:
|
||||
"urlParse": urlParse,
|
||||
"urlJoin": urlJoin,
|
||||
}
|
||||
464
vendor/github.com/Masterminds/sprig/v3/list.go
generated
vendored
Normal file
|
|
@@ -0,0 +1,464 @@
|
|||
package sprig
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
"math"
|
||||
"reflect"
|
||||
"sort"
|
||||
)
|
||||
|
||||
// Reflection is used in these functions so that slices and arrays of strings,
|
||||
// ints, and other types not implementing []interface{} can be worked with.
|
||||
// For example, this is useful if you need to work on the output of regexes.
|
||||
|
||||
func list(v ...interface{}) []interface{} {
|
||||
return v
|
||||
}
|
||||
|
||||
func push(list interface{}, v interface{}) []interface{} {
|
||||
l, err := mustPush(list, v)
|
||||
if err != nil {
|
||||
panic(err)
|
||||
}
|
||||
|
||||
return l
|
||||
}
|
||||
|
||||
func mustPush(list interface{}, v interface{}) ([]interface{}, error) {
|
||||
tp := reflect.TypeOf(list).Kind()
|
||||
switch tp {
|
||||
case reflect.Slice, reflect.Array:
|
||||
l2 := reflect.ValueOf(list)
|
||||
|
||||
l := l2.Len()
|
||||
nl := make([]interface{}, l)
|
||||
for i := 0; i < l; i++ {
|
||||
nl[i] = l2.Index(i).Interface()
|
||||
}
|
||||
|
||||
return append(nl, v), nil
|
||||
|
||||
default:
|
||||
return nil, fmt.Errorf("Cannot push on type %s", tp)
|
||||
}
|
||||
}
|
||||
|
||||
func prepend(list interface{}, v interface{}) []interface{} {
|
||||
l, err := mustPrepend(list, v)
|
||||
if err != nil {
|
||||
panic(err)
|
||||
}
|
||||
|
||||
return l
|
||||
}
|
||||
|
||||
func mustPrepend(list interface{}, v interface{}) ([]interface{}, error) {
|
||||
//return append([]interface{}{v}, list...)
|
||||
|
||||
tp := reflect.TypeOf(list).Kind()
|
||||
switch tp {
|
||||
case reflect.Slice, reflect.Array:
|
||||
l2 := reflect.ValueOf(list)
|
||||
|
||||
l := l2.Len()
|
||||
nl := make([]interface{}, l)
|
||||
for i := 0; i < l; i++ {
|
||||
nl[i] = l2.Index(i).Interface()
|
||||
}
|
||||
|
||||
return append([]interface{}{v}, nl...), nil
|
||||
|
||||
default:
|
||||
return nil, fmt.Errorf("Cannot prepend on type %s", tp)
|
||||
}
|
||||
}
|
||||
|
||||
func chunk(size int, list interface{}) [][]interface{} {
|
||||
l, err := mustChunk(size, list)
|
||||
if err != nil {
|
||||
panic(err)
|
||||
}
|
||||
|
||||
return l
|
||||
}
|
||||
|
||||
func mustChunk(size int, list interface{}) ([][]interface{}, error) {
|
||||
tp := reflect.TypeOf(list).Kind()
|
||||
switch tp {
|
||||
case reflect.Slice, reflect.Array:
|
||||
l2 := reflect.ValueOf(list)
|
||||
|
||||
l := l2.Len()
|
||||
|
||||
cs := int(math.Floor(float64(l-1)/float64(size)) + 1)
|
||||
nl := make([][]interface{}, cs)
|
||||
|
||||
for i := 0; i < cs; i++ {
|
||||
clen := size
|
||||
if i == cs-1 {
|
||||
clen = int(math.Floor(math.Mod(float64(l), float64(size))))
|
||||
if clen == 0 {
|
||||
clen = size
|
||||
}
|
||||
}
|
||||
|
||||
nl[i] = make([]interface{}, clen)
|
||||
|
||||
for j := 0; j < clen; j++ {
|
||||
ix := i*size + j
|
||||
nl[i][j] = l2.Index(ix).Interface()
|
||||
}
|
||||
}
|
||||
|
||||
return nl, nil
|
||||
|
||||
default:
|
||||
return nil, fmt.Errorf("Cannot chunk type %s", tp)
|
||||
}
|
||||
}
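// Sketch: chunk splits a list into groups of `size`, the last group holding
// whatever remains.
//
//   chunk(2, []interface{}{1, 2, 3, 4, 5}) // [[1 2] [3 4] [5]]
//   // in a template: {{ chunk 2 (list 1 2 3 4 5) }}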
|
||||
|
||||
func last(list interface{}) interface{} {
|
||||
l, err := mustLast(list)
|
||||
if err != nil {
|
||||
panic(err)
|
||||
}
|
||||
|
||||
return l
|
||||
}
|
||||
|
||||
func mustLast(list interface{}) (interface{}, error) {
|
||||
tp := reflect.TypeOf(list).Kind()
|
||||
switch tp {
|
||||
case reflect.Slice, reflect.Array:
|
||||
l2 := reflect.ValueOf(list)
|
||||
|
||||
l := l2.Len()
|
||||
if l == 0 {
|
||||
return nil, nil
|
||||
}
|
||||
|
||||
return l2.Index(l - 1).Interface(), nil
|
||||
default:
|
||||
return nil, fmt.Errorf("Cannot find last on type %s", tp)
|
||||
}
|
||||
}
|
||||
|
||||
func first(list interface{}) interface{} {
|
||||
l, err := mustFirst(list)
|
||||
if err != nil {
|
||||
panic(err)
|
||||
}
|
||||
|
||||
return l
|
||||
}
|
||||
|
||||
func mustFirst(list interface{}) (interface{}, error) {
|
||||
tp := reflect.TypeOf(list).Kind()
|
||||
switch tp {
|
||||
case reflect.Slice, reflect.Array:
|
||||
l2 := reflect.ValueOf(list)
|
||||
|
||||
l := l2.Len()
|
||||
if l == 0 {
|
||||
return nil, nil
|
||||
}
|
||||
|
||||
return l2.Index(0).Interface(), nil
|
||||
default:
|
||||
return nil, fmt.Errorf("Cannot find first on type %s", tp)
|
||||
}
|
||||
}
|
||||
|
||||
func rest(list interface{}) []interface{} {
|
||||
l, err := mustRest(list)
|
||||
if err != nil {
|
||||
panic(err)
|
||||
}
|
||||
|
||||
return l
|
||||
}
|
||||
|
||||
func mustRest(list interface{}) ([]interface{}, error) {
|
||||
tp := reflect.TypeOf(list).Kind()
|
||||
switch tp {
|
||||
case reflect.Slice, reflect.Array:
|
||||
l2 := reflect.ValueOf(list)
|
||||
|
||||
l := l2.Len()
|
||||
if l == 0 {
|
||||
return nil, nil
|
||||
}
|
||||
|
||||
nl := make([]interface{}, l-1)
|
||||
for i := 1; i < l; i++ {
|
||||
nl[i-1] = l2.Index(i).Interface()
|
||||
}
|
||||
|
||||
return nl, nil
|
||||
default:
|
||||
return nil, fmt.Errorf("Cannot find rest on type %s", tp)
|
||||
}
|
||||
}
|
||||
|
||||
func initial(list interface{}) []interface{} {
|
||||
l, err := mustInitial(list)
|
||||
if err != nil {
|
||||
panic(err)
|
||||
}
|
||||
|
||||
return l
|
||||
}
|
||||
|
||||
func mustInitial(list interface{}) ([]interface{}, error) {
|
||||
tp := reflect.TypeOf(list).Kind()
|
||||
switch tp {
|
||||
case reflect.Slice, reflect.Array:
|
||||
l2 := reflect.ValueOf(list)
|
||||
|
||||
l := l2.Len()
|
||||
if l == 0 {
|
||||
return nil, nil
|
||||
}
|
||||
|
||||
nl := make([]interface{}, l-1)
|
||||
for i := 0; i < l-1; i++ {
|
||||
nl[i] = l2.Index(i).Interface()
|
||||
}
|
||||
|
||||
return nl, nil
|
||||
default:
|
||||
return nil, fmt.Errorf("Cannot find initial on type %s", tp)
|
||||
}
|
||||
}
|
||||
|
||||
func sortAlpha(list interface{}) []string {
|
||||
k := reflect.Indirect(reflect.ValueOf(list)).Kind()
|
||||
switch k {
|
||||
case reflect.Slice, reflect.Array:
|
||||
a := strslice(list)
|
||||
s := sort.StringSlice(a)
|
||||
s.Sort()
|
||||
return s
|
||||
}
|
||||
return []string{strval(list)}
|
||||
}
|
||||
|
||||
func reverse(v interface{}) []interface{} {
|
||||
l, err := mustReverse(v)
|
||||
if err != nil {
|
||||
panic(err)
|
||||
}
|
||||
|
||||
return l
|
||||
}
|
||||
|
||||
func mustReverse(v interface{}) ([]interface{}, error) {
|
||||
tp := reflect.TypeOf(v).Kind()
|
||||
switch tp {
|
||||
case reflect.Slice, reflect.Array:
|
||||
l2 := reflect.ValueOf(v)
|
||||
|
||||
l := l2.Len()
|
||||
// We do not sort in place because the incoming array should not be altered.
|
||||
nl := make([]interface{}, l)
|
||||
for i := 0; i < l; i++ {
|
||||
nl[l-i-1] = l2.Index(i).Interface()
|
||||
}
|
||||
|
||||
return nl, nil
|
||||
default:
|
||||
return nil, fmt.Errorf("Cannot find reverse on type %s", tp)
|
||||
}
|
||||
}
|
||||
|
||||
func compact(list interface{}) []interface{} {
|
||||
l, err := mustCompact(list)
|
||||
if err != nil {
|
||||
panic(err)
|
||||
}
|
||||
|
||||
return l
|
||||
}
|
||||
|
||||
func mustCompact(list interface{}) ([]interface{}, error) {
|
||||
tp := reflect.TypeOf(list).Kind()
|
||||
switch tp {
|
||||
case reflect.Slice, reflect.Array:
|
||||
l2 := reflect.ValueOf(list)
|
||||
|
||||
l := l2.Len()
|
||||
nl := []interface{}{}
|
||||
var item interface{}
|
||||
for i := 0; i < l; i++ {
|
||||
item = l2.Index(i).Interface()
|
||||
if !empty(item) {
|
||||
nl = append(nl, item)
|
||||
}
|
||||
}
|
||||
|
||||
return nl, nil
|
||||
default:
|
||||
return nil, fmt.Errorf("Cannot compact on type %s", tp)
|
||||
}
|
||||
}
|
||||
|
||||
func uniq(list interface{}) []interface{} {
|
||||
l, err := mustUniq(list)
|
||||
if err != nil {
|
||||
panic(err)
|
||||
}
|
||||
|
||||
return l
|
||||
}
|
||||
|
||||
func mustUniq(list interface{}) ([]interface{}, error) {
|
||||
tp := reflect.TypeOf(list).Kind()
|
||||
switch tp {
|
||||
case reflect.Slice, reflect.Array:
|
||||
l2 := reflect.ValueOf(list)
|
||||
|
||||
l := l2.Len()
|
||||
dest := []interface{}{}
|
||||
var item interface{}
|
||||
for i := 0; i < l; i++ {
|
||||
item = l2.Index(i).Interface()
|
||||
if !inList(dest, item) {
|
||||
dest = append(dest, item)
|
||||
}
|
||||
}
|
||||
|
||||
return dest, nil
|
||||
default:
|
||||
return nil, fmt.Errorf("Cannot find uniq on type %s", tp)
|
||||
}
|
||||
}
|
||||
|
||||
func inList(haystack []interface{}, needle interface{}) bool {
|
||||
for _, h := range haystack {
|
||||
if reflect.DeepEqual(needle, h) {
|
||||
return true
|
||||
}
|
||||
}
|
||||
return false
|
||||
}
|
||||
|
||||
func without(list interface{}, omit ...interface{}) []interface{} {
|
||||
l, err := mustWithout(list, omit...)
|
||||
if err != nil {
|
||||
panic(err)
|
||||
}
|
||||
|
||||
return l
|
||||
}
|
||||
|
||||
func mustWithout(list interface{}, omit ...interface{}) ([]interface{}, error) {
|
||||
tp := reflect.TypeOf(list).Kind()
|
||||
switch tp {
|
||||
case reflect.Slice, reflect.Array:
|
||||
l2 := reflect.ValueOf(list)
|
||||
|
||||
l := l2.Len()
|
||||
res := []interface{}{}
|
||||
var item interface{}
|
||||
for i := 0; i < l; i++ {
|
||||
item = l2.Index(i).Interface()
|
||||
if !inList(omit, item) {
|
||||
res = append(res, item)
|
||||
}
|
||||
}
|
||||
|
||||
return res, nil
|
||||
default:
|
||||
return nil, fmt.Errorf("Cannot find without on type %s", tp)
|
||||
}
|
||||
}
|
||||
|
||||
func has(needle interface{}, haystack interface{}) bool {
|
||||
l, err := mustHas(needle, haystack)
|
||||
if err != nil {
|
||||
panic(err)
|
||||
}
|
||||
|
||||
return l
|
||||
}
|
||||
|
||||
func mustHas(needle interface{}, haystack interface{}) (bool, error) {
|
||||
if haystack == nil {
|
||||
return false, nil
|
||||
}
|
||||
tp := reflect.TypeOf(haystack).Kind()
|
||||
switch tp {
|
||||
case reflect.Slice, reflect.Array:
|
||||
l2 := reflect.ValueOf(haystack)
|
||||
var item interface{}
|
||||
l := l2.Len()
|
||||
for i := 0; i < l; i++ {
|
||||
item = l2.Index(i).Interface()
|
||||
if reflect.DeepEqual(needle, item) {
|
||||
return true, nil
|
||||
}
|
||||
}
|
||||
|
||||
return false, nil
|
||||
default:
|
||||
return false, fmt.Errorf("Cannot find has on type %s", tp)
|
||||
}
|
||||
}
|
||||
|
||||
// $list := [1, 2, 3, 4, 5]
|
||||
// slice $list -> list[0:5] = list[:]
|
||||
// slice $list 0 3 -> list[0:3] = list[:3]
|
||||
// slice $list 3 5 -> list[3:5]
|
||||
// slice $list 3 -> list[3:5] = list[3:]
|
||||
func slice(list interface{}, indices ...interface{}) interface{} {
|
||||
l, err := mustSlice(list, indices...)
|
||||
if err != nil {
|
||||
panic(err)
|
||||
}
|
||||
|
||||
return l
|
||||
}
|
||||
|
||||
func mustSlice(list interface{}, indices ...interface{}) (interface{}, error) {
|
||||
tp := reflect.TypeOf(list).Kind()
|
||||
switch tp {
|
||||
case reflect.Slice, reflect.Array:
|
||||
l2 := reflect.ValueOf(list)
|
||||
|
||||
l := l2.Len()
|
||||
if l == 0 {
|
||||
return nil, nil
|
||||
}
|
||||
|
||||
var start, end int
|
||||
if len(indices) > 0 {
|
||||
start = toInt(indices[0])
|
||||
}
|
||||
if len(indices) < 2 {
|
||||
end = l
|
||||
} else {
|
||||
end = toInt(indices[1])
|
||||
}
|
||||
|
||||
return l2.Slice(start, end).Interface(), nil
|
||||
default:
|
||||
return nil, fmt.Errorf("list should be type of slice or array but %s", tp)
|
||||
}
|
||||
}
|
||||
|
||||
func concat(lists ...interface{}) interface{} {
|
||||
var res []interface{}
|
||||
for _, list := range lists {
|
||||
tp := reflect.TypeOf(list).Kind()
|
||||
switch tp {
|
||||
case reflect.Slice, reflect.Array:
|
||||
l2 := reflect.ValueOf(list)
|
||||
for i := 0; i < l2.Len(); i++ {
|
||||
res = append(res, l2.Index(i).Interface())
|
||||
}
|
||||
default:
|
||||
panic(fmt.Sprintf("Cannot concat type %s as list", tp))
|
||||
}
|
||||
}
|
||||
return res
|
||||
}
|
||||
12
vendor/github.com/Masterminds/sprig/v3/network.go
generated
vendored
Normal file
|
|
@@ -0,0 +1,12 @@
|
|||
package sprig
|
||||
|
||||
import (
|
||||
"math/rand"
|
||||
"net"
|
||||
)
|
||||
|
||||
func getHostByName(name string) string {
|
||||
addrs, _ := net.LookupHost(name)
|
||||
//TODO: add error handling when release v3 comes out
|
||||
return addrs[rand.Intn(len(addrs))]
|
||||
}
|
||||
186
vendor/github.com/Masterminds/sprig/v3/numeric.go
generated
vendored
Normal file
|
|
@@ -0,0 +1,186 @@
|
|||
package sprig
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
"math"
|
||||
"strconv"
|
||||
"strings"
|
||||
|
||||
"github.com/spf13/cast"
|
||||
"github.com/shopspring/decimal"
|
||||
)
|
||||
|
||||
// toFloat64 converts 64-bit floats
|
||||
func toFloat64(v interface{}) float64 {
|
||||
return cast.ToFloat64(v)
|
||||
}
|
||||
|
||||
func toInt(v interface{}) int {
|
||||
return cast.ToInt(v)
|
||||
}
|
||||
|
||||
// toInt64 converts integer types to 64-bit integers
|
||||
func toInt64(v interface{}) int64 {
|
||||
return cast.ToInt64(v)
|
||||
}
|
||||
|
||||
func max(a interface{}, i ...interface{}) int64 {
|
||||
aa := toInt64(a)
|
||||
for _, b := range i {
|
||||
bb := toInt64(b)
|
||||
if bb > aa {
|
||||
aa = bb
|
||||
}
|
||||
}
|
||||
return aa
|
||||
}
|
||||
|
||||
func maxf(a interface{}, i ...interface{}) float64 {
|
||||
aa := toFloat64(a)
|
||||
for _, b := range i {
|
||||
bb := toFloat64(b)
|
||||
aa = math.Max(aa, bb)
|
||||
}
|
||||
return aa
|
||||
}
|
||||
|
||||
func min(a interface{}, i ...interface{}) int64 {
|
||||
aa := toInt64(a)
|
||||
for _, b := range i {
|
||||
bb := toInt64(b)
|
||||
if bb < aa {
|
||||
aa = bb
|
||||
}
|
||||
}
|
||||
return aa
|
||||
}
|
||||
|
||||
func minf(a interface{}, i ...interface{}) float64 {
|
||||
aa := toFloat64(a)
|
||||
for _, b := range i {
|
||||
bb := toFloat64(b)
|
||||
aa = math.Min(aa, bb)
|
||||
}
|
||||
return aa
|
||||
}
|
||||
|
||||
func until(count int) []int {
|
||||
step := 1
|
||||
if count < 0 {
|
||||
step = -1
|
||||
}
|
||||
return untilStep(0, count, step)
|
||||
}
|
||||
|
||||
func untilStep(start, stop, step int) []int {
|
||||
v := []int{}
|
||||
|
||||
if stop < start {
|
||||
if step >= 0 {
|
||||
return v
|
||||
}
|
||||
for i := start; i > stop; i += step {
|
||||
v = append(v, i)
|
||||
}
|
||||
return v
|
||||
}
|
||||
|
||||
if step <= 0 {
|
||||
return v
|
||||
}
|
||||
for i := start; i < stop; i += step {
|
||||
v = append(v, i)
|
||||
}
|
||||
return v
|
||||
}
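// Sketch: until counts from zero toward count; untilStep counts from start
// toward stop in increments of step, returning an empty slice when the step
// points away from stop.
//
//   until(5)            // [0 1 2 3 4]
//   untilStep(3, 10, 3) // [3 6 9]
//   untilStep(10, 0, 1) // [] (step has the wrong sign)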
|
||||
|
||||
func floor(a interface{}) float64 {
|
||||
aa := toFloat64(a)
|
||||
return math.Floor(aa)
|
||||
}
|
||||
|
||||
func ceil(a interface{}) float64 {
|
||||
aa := toFloat64(a)
|
||||
return math.Ceil(aa)
|
||||
}
|
||||
|
||||
func round(a interface{}, p int, rOpt ...float64) float64 {
|
||||
roundOn := .5
|
||||
if len(rOpt) > 0 {
|
||||
roundOn = rOpt[0]
|
||||
}
|
||||
val := toFloat64(a)
|
||||
places := toFloat64(p)
|
||||
|
||||
var round float64
|
||||
pow := math.Pow(10, places)
|
||||
digit := pow * val
|
||||
_, div := math.Modf(digit)
|
||||
if div >= roundOn {
|
||||
round = math.Ceil(digit)
|
||||
} else {
|
||||
round = math.Floor(digit)
|
||||
}
|
||||
return round / pow
|
||||
}
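// Sketch: round rounds to p decimal places; the optional third argument
// (default 0.5) is the threshold at which the last digit rounds up.
//
//   round(3.14159, 3)  // 3.142
//   round(2.5, 0)      // 3
//   round(2.5, 0, 0.6) // 2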
|
||||
|
||||
// toDecimal converts a Unix octal string (e.g. "0777") to its decimal int64 value, returning 0 if parsing fails
|
||||
func toDecimal(v interface{}) int64 {
|
||||
result, err := strconv.ParseInt(fmt.Sprint(v), 8, 64)
|
||||
if err != nil {
|
||||
return 0
|
||||
}
|
||||
return result
|
||||
}
|
||||
|
||||
func seq(params ...int) string {
|
||||
increment := 1
|
||||
switch len(params) {
|
||||
case 0:
|
||||
return ""
|
||||
case 1:
|
||||
start := 1
|
||||
end := params[0]
|
||||
if end < start {
|
||||
increment = -1
|
||||
}
|
||||
return intArrayToString(untilStep(start, end+increment, increment), " ")
|
||||
case 3:
|
||||
start := params[0]
|
||||
end := params[2]
|
||||
step := params[1]
|
||||
if end < start {
|
||||
increment = -1
|
||||
if step > 0 {
|
||||
return ""
|
||||
}
|
||||
}
|
||||
return intArrayToString(untilStep(start, end+increment, step), " ")
|
||||
case 2:
|
||||
start := params[0]
|
||||
end := params[1]
|
||||
step := 1
|
||||
if end < start {
|
||||
step = -1
|
||||
}
|
||||
return intArrayToString(untilStep(start, end+step, step), " ")
|
||||
default:
|
||||
return ""
|
||||
}
|
||||
}
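// Sketch: seq mimics the Unix seq command and returns a space-separated string.
//
//   seq(5)        // "1 2 3 4 5"
//   seq(0, 2, 10) // "0 2 4 6 8 10"
//   seq(3, 1)     // "3 2 1"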
|
||||
|
||||
func intArrayToString(slice []int, delimiter string) string {
|
||||
return strings.Trim(strings.Join(strings.Fields(fmt.Sprint(slice)), delimiter), "[]")
|
||||
}
|
||||
|
||||
// performs a float and subsequent decimal.Decimal conversion on inputs,
|
||||
// and iterates through a and b executing the mathematical operation f
|
||||
func execDecimalOp(a interface{}, b []interface{}, f func(d1, d2 decimal.Decimal) decimal.Decimal) float64 {
|
||||
prt := decimal.NewFromFloat(toFloat64(a))
|
||||
for _, x := range b {
|
||||
dx := decimal.NewFromFloat(toFloat64(x))
|
||||
prt = f(prt, dx)
|
||||
}
|
||||
rslt, _ := prt.Float64()
|
||||
return rslt
|
||||
}
|
||||
28
vendor/github.com/Masterminds/sprig/v3/reflect.go
generated
vendored
Normal file
|
|
@@ -0,0 +1,28 @@
|
|||
package sprig
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
"reflect"
|
||||
)
|
||||
|
||||
// typeIs returns true if the src is the type named in target.
|
||||
func typeIs(target string, src interface{}) bool {
|
||||
return target == typeOf(src)
|
||||
}
|
||||
|
||||
func typeIsLike(target string, src interface{}) bool {
|
||||
t := typeOf(src)
|
||||
return target == t || "*"+target == t
|
||||
}
|
||||
|
||||
func typeOf(src interface{}) string {
|
||||
return fmt.Sprintf("%T", src)
|
||||
}
|
||||
|
||||
func kindIs(target string, src interface{}) bool {
|
||||
return target == kindOf(src)
|
||||
}
|
||||
|
||||
func kindOf(src interface{}) string {
|
||||
return reflect.ValueOf(src).Kind().String()
|
||||
}
|
||||
83
vendor/github.com/Masterminds/sprig/v3/regex.go
generated
vendored
Normal file
|
|
@@ -0,0 +1,83 @@
|
|||
package sprig
|
||||
|
||||
import (
|
||||
"regexp"
|
||||
)
|
||||
|
||||
func regexMatch(regex string, s string) bool {
|
||||
match, _ := regexp.MatchString(regex, s)
|
||||
return match
|
||||
}
|
||||
|
||||
func mustRegexMatch(regex string, s string) (bool, error) {
|
||||
return regexp.MatchString(regex, s)
|
||||
}
|
||||
|
||||
func regexFindAll(regex string, s string, n int) []string {
|
||||
r := regexp.MustCompile(regex)
|
||||
return r.FindAllString(s, n)
|
||||
}
|
||||
|
||||
func mustRegexFindAll(regex string, s string, n int) ([]string, error) {
|
||||
r, err := regexp.Compile(regex)
|
||||
if err != nil {
|
||||
return []string{}, err
|
||||
}
|
||||
return r.FindAllString(s, n), nil
|
||||
}
|
||||
|
||||
func regexFind(regex string, s string) string {
|
||||
r := regexp.MustCompile(regex)
|
||||
return r.FindString(s)
|
||||
}
|
||||
|
||||
func mustRegexFind(regex string, s string) (string, error) {
|
||||
r, err := regexp.Compile(regex)
|
||||
if err != nil {
|
||||
return "", err
|
||||
}
|
||||
return r.FindString(s), nil
|
||||
}
|
||||
|
||||
func regexReplaceAll(regex string, s string, repl string) string {
|
||||
r := regexp.MustCompile(regex)
|
||||
return r.ReplaceAllString(s, repl)
|
||||
}
|
||||
|
||||
func mustRegexReplaceAll(regex string, s string, repl string) (string, error) {
|
||||
r, err := regexp.Compile(regex)
|
||||
if err != nil {
|
||||
return "", err
|
||||
}
|
||||
return r.ReplaceAllString(s, repl), nil
|
||||
}
|
||||
|
||||
func regexReplaceAllLiteral(regex string, s string, repl string) string {
|
||||
r := regexp.MustCompile(regex)
|
||||
return r.ReplaceAllLiteralString(s, repl)
|
||||
}
|
||||
|
||||
func mustRegexReplaceAllLiteral(regex string, s string, repl string) (string, error) {
|
||||
r, err := regexp.Compile(regex)
|
||||
if err != nil {
|
||||
return "", err
|
||||
}
|
||||
return r.ReplaceAllLiteralString(s, repl), nil
|
||||
}
|
||||
|
||||
func regexSplit(regex string, s string, n int) []string {
|
||||
r := regexp.MustCompile(regex)
|
||||
return r.Split(s, n)
|
||||
}
|
||||
|
||||
func mustRegexSplit(regex string, s string, n int) ([]string, error) {
|
||||
r, err := regexp.Compile(regex)
|
||||
if err != nil {
|
||||
return []string{}, err
|
||||
}
|
||||
return r.Split(s, n), nil
|
||||
}
|
||||
|
||||
func regexQuoteMeta(s string) string {
|
||||
return regexp.QuoteMeta(s)
|
||||
}
|
||||
23
vendor/github.com/Masterminds/sprig/v3/semver.go
generated
vendored
Normal file
|
|
@@ -0,0 +1,23 @@
|
|||
package sprig
|
||||
|
||||
import (
|
||||
sv2 "github.com/Masterminds/semver/v3"
|
||||
)
|
||||
|
||||
func semverCompare(constraint, version string) (bool, error) {
|
||||
c, err := sv2.NewConstraint(constraint)
|
||||
if err != nil {
|
||||
return false, err
|
||||
}
|
||||
|
||||
v, err := sv2.NewVersion(version)
|
||||
if err != nil {
|
||||
return false, err
|
||||
}
|
||||
|
||||
return c.Check(v), nil
|
||||
}
|
||||
|
||||
func semver(version string) (*sv2.Version, error) {
|
||||
return sv2.NewVersion(version)
|
||||
}
|
||||
236
vendor/github.com/Masterminds/sprig/v3/strings.go
generated
vendored
Normal file
|
|
@@ -0,0 +1,236 @@
|
|||
package sprig
|
||||
|
||||
import (
|
||||
"encoding/base32"
|
||||
"encoding/base64"
|
||||
"fmt"
|
||||
"reflect"
|
||||
"strconv"
|
||||
"strings"
|
||||
|
||||
util "github.com/Masterminds/goutils"
|
||||
)
|
||||
|
||||
func base64encode(v string) string {
|
||||
return base64.StdEncoding.EncodeToString([]byte(v))
|
||||
}
|
||||
|
||||
func base64decode(v string) string {
|
||||
data, err := base64.StdEncoding.DecodeString(v)
|
||||
if err != nil {
|
||||
return err.Error()
|
||||
}
|
||||
return string(data)
|
||||
}
|
||||
|
||||
func base32encode(v string) string {
|
||||
return base32.StdEncoding.EncodeToString([]byte(v))
|
||||
}
|
||||
|
||||
func base32decode(v string) string {
|
||||
data, err := base32.StdEncoding.DecodeString(v)
|
||||
if err != nil {
|
||||
return err.Error()
|
||||
}
|
||||
return string(data)
|
||||
}
func abbrev(width int, s string) string {
	if width < 4 {
		return s
	}
	r, _ := util.Abbreviate(s, width)
	return r
}

func abbrevboth(left, right int, s string) string {
	if right < 4 || left > 0 && right < 7 {
		return s
	}
	r, _ := util.AbbreviateFull(s, left, right)
	return r
}

func initials(s string) string {
	// Wrap this just to eliminate the var args, which templates don't do well.
	return util.Initials(s)
}

func randAlphaNumeric(count int) string {
	// It is not possible, it appears, to actually generate an error here.
	r, _ := util.CryptoRandomAlphaNumeric(count)
	return r
}

func randAlpha(count int) string {
	r, _ := util.CryptoRandomAlphabetic(count)
	return r
}

func randAscii(count int) string {
	r, _ := util.CryptoRandomAscii(count)
	return r
}

func randNumeric(count int) string {
	r, _ := util.CryptoRandomNumeric(count)
	return r
}

func untitle(str string) string {
	return util.Uncapitalize(str)
}

func quote(str ...interface{}) string {
	out := make([]string, 0, len(str))
	for _, s := range str {
		if s != nil {
			out = append(out, fmt.Sprintf("%q", strval(s)))
		}
	}
	return strings.Join(out, " ")
}

func squote(str ...interface{}) string {
	out := make([]string, 0, len(str))
	for _, s := range str {
		if s != nil {
			out = append(out, fmt.Sprintf("'%v'", s))
		}
	}
	return strings.Join(out, " ")
}

func cat(v ...interface{}) string {
	v = removeNilElements(v)
	r := strings.TrimSpace(strings.Repeat("%v ", len(v)))
	return fmt.Sprintf(r, v...)
}

func indent(spaces int, v string) string {
	pad := strings.Repeat(" ", spaces)
	return pad + strings.Replace(v, "\n", "\n"+pad, -1)
}

func nindent(spaces int, v string) string {
	return "\n" + indent(spaces, v)
}

func replace(old, new, src string) string {
	return strings.Replace(src, old, new, -1)
}

func plural(one, many string, count int) string {
	if count == 1 {
		return one
	}
	return many
}

func strslice(v interface{}) []string {
	switch v := v.(type) {
	case []string:
		return v
	case []interface{}:
		b := make([]string, 0, len(v))
		for _, s := range v {
			if s != nil {
				b = append(b, strval(s))
			}
		}
		return b
	default:
		val := reflect.ValueOf(v)
		switch val.Kind() {
		case reflect.Array, reflect.Slice:
			l := val.Len()
			b := make([]string, 0, l)
			for i := 0; i < l; i++ {
				value := val.Index(i).Interface()
				if value != nil {
					b = append(b, strval(value))
				}
			}
			return b
		default:
			if v == nil {
				return []string{}
			}

			return []string{strval(v)}
		}
	}
}

func removeNilElements(v []interface{}) []interface{} {
	newSlice := make([]interface{}, 0, len(v))
	for _, i := range v {
		if i != nil {
			newSlice = append(newSlice, i)
		}
	}
	return newSlice
}

func strval(v interface{}) string {
	switch v := v.(type) {
	case string:
		return v
	case []byte:
		return string(v)
	case error:
		return v.Error()
	case fmt.Stringer:
		return v.String()
	default:
		return fmt.Sprintf("%v", v)
	}
}

func trunc(c int, s string) string {
	if c < 0 && len(s)+c > 0 {
		return s[len(s)+c:]
	}
	if c >= 0 && len(s) > c {
		return s[:c]
	}
	return s
}

func join(sep string, v interface{}) string {
	return strings.Join(strslice(v), sep)
}

func split(sep, orig string) map[string]string {
	parts := strings.Split(orig, sep)
	res := make(map[string]string, len(parts))
	for i, v := range parts {
		res["_"+strconv.Itoa(i)] = v
	}
	return res
}

func splitn(sep string, n int, orig string) map[string]string {
	parts := strings.SplitN(orig, sep, n)
	res := make(map[string]string, len(parts))
	for i, v := range parts {
		res["_"+strconv.Itoa(i)] = v
	}
	return res
}

// substring creates a substring of the given string.
//
// If start is < 0, this calls string[:end].
//
// If start is >= 0 and end < 0 or end bigger than s length, this calls string[start:]
//
// Otherwise, this calls string[start, end].
func substring(start, end int, s string) string {
	if start < 0 {
		return s[:end]
	}
	if end < 0 || end > len(s) {
		return s[start:]
	}
	return s[start:end]
}
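These helpers are only reachable from templates through Sprig's function map. The short sketch below is not part of the vendored diff; it assumes the usual exported entry point `sprig.TxtFuncMap()` and the template names `splitn`, `trunc`, `list` and `join` that Sprig documents for the functions above.

```go
package main

import (
	"os"
	"text/template"

	"github.com/Masterminds/sprig/v3"
)

func main() {
	// splitn returns a map keyed "_0", "_1", ... so parts can be addressed by
	// field syntax; trunc cuts a string; list + join stitch values together.
	const src = `{{ $parts := splitn "/" 2 "docs/readme/intro" }}{{ $parts._0 }}|{{ $parts._1 }}
{{ trunc 5 "hello world" }}
{{ list "a" "b" "c" | join "-" }}
`
	tpl := template.Must(template.New("demo").Funcs(sprig.TxtFuncMap()).Parse(src))
	_ = tpl.Execute(os.Stdout, nil) // docs|readme/intro, hello, a-b-c
}
```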
66 vendor/github.com/Masterminds/sprig/v3/url.go generated vendored Normal file
@@ -0,0 +1,66 @@
package sprig

import (
	"fmt"
	"net/url"
	"reflect"
)

func dictGetOrEmpty(dict map[string]interface{}, key string) string {
	value, ok := dict[key]
	if !ok {
		return ""
	}
	tp := reflect.TypeOf(value).Kind()
	if tp != reflect.String {
		panic(fmt.Sprintf("unable to parse %s key, must be of type string, but %s found", key, tp.String()))
	}
	return reflect.ValueOf(value).String()
}

// parses given URL to return dict object
func urlParse(v string) map[string]interface{} {
	dict := map[string]interface{}{}
	parsedURL, err := url.Parse(v)
	if err != nil {
		panic(fmt.Sprintf("unable to parse url: %s", err))
	}
	dict["scheme"] = parsedURL.Scheme
	dict["host"] = parsedURL.Host
	dict["hostname"] = parsedURL.Hostname()
	dict["path"] = parsedURL.Path
	dict["query"] = parsedURL.RawQuery
	dict["opaque"] = parsedURL.Opaque
	dict["fragment"] = parsedURL.Fragment
	if parsedURL.User != nil {
		dict["userinfo"] = parsedURL.User.String()
	} else {
		dict["userinfo"] = ""
	}

	return dict
}

// join given dict to URL string
func urlJoin(d map[string]interface{}) string {
	resURL := url.URL{
		Scheme:   dictGetOrEmpty(d, "scheme"),
		Host:     dictGetOrEmpty(d, "host"),
		Path:     dictGetOrEmpty(d, "path"),
		RawQuery: dictGetOrEmpty(d, "query"),
		Opaque:   dictGetOrEmpty(d, "opaque"),
		Fragment: dictGetOrEmpty(d, "fragment"),
	}
	userinfo := dictGetOrEmpty(d, "userinfo")
	var user *url.Userinfo
	if userinfo != "" {
		tempURL, err := url.Parse(fmt.Sprintf("proto://%s@host", userinfo))
		if err != nil {
			panic(fmt.Sprintf("unable to parse userinfo in dict: %s", err))
		}
		user = tempURL.User
	}

	resURL.User = user
	return resURL.String()
}
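A quick round-trip sketch, not part of the vendored diff: it assumes the template names `urlParse` and `urlJoin` that Sprig registers for the two functions above, and the URL is just an illustrative value.

```go
package main

import (
	"os"
	"text/template"

	"github.com/Masterminds/sprig/v3"
)

func main() {
	// urlParse explodes a URL into the dict shown above; urlJoin rebuilds it.
	const src = `{{ $u := urlParse "https://user:pw@example.org/search?q=go#top" }}scheme={{ $u.scheme }} host={{ $u.host }} path={{ $u.path }}
{{ urlJoin $u }}
`
	tpl := template.Must(template.New("url").Funcs(sprig.TxtFuncMap()).Parse(src))
	_ = tpl.Execute(os.Stdout, nil)
}
```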
15 vendor/github.com/asaskevich/govalidator/.gitignore generated vendored Normal file
@@ -0,0 +1,15 @@
bin/
.idea/
# Binaries for programs and plugins
*.exe
*.exe~
*.dll
*.so
*.dylib

# Test binary, built with `go test -c`
*.test

# Output of the go coverage tool, specifically when used with LiteIDE
*.out
12 vendor/github.com/asaskevich/govalidator/.travis.yml generated vendored Normal file
@@ -0,0 +1,12 @@
language: go
dist: xenial
go:
  - '1.10'
  - '1.11'
  - '1.12'
  - '1.13'
  - 'tip'

script:
  - go test -coverpkg=./... -coverprofile=coverage.info -timeout=5s
  - bash <(curl -s https://codecov.io/bash)
43 vendor/github.com/asaskevich/govalidator/CODE_OF_CONDUCT.md generated vendored Normal file
@@ -0,0 +1,43 @@
# Contributor Code of Conduct
|
||||
|
||||
This project adheres to [The Code Manifesto](http://codemanifesto.com)
|
||||
as its guidelines for contributor interactions.
|
||||
|
||||
## The Code Manifesto
|
||||
|
||||
We want to work in an ecosystem that empowers developers to reach their
|
||||
potential — one that encourages growth and effective collaboration. A space
|
||||
that is safe for all.
|
||||
|
||||
A space such as this benefits everyone that participates in it. It encourages
|
||||
new developers to enter our field. It is through discussion and collaboration
|
||||
that we grow, and through growth that we improve.
|
||||
|
||||
In the effort to create such a place, we hold to these values:
|
||||
|
||||
1. **Discrimination limits us.** This includes discrimination on the basis of
|
||||
race, gender, sexual orientation, gender identity, age, nationality,
|
||||
technology and any other arbitrary exclusion of a group of people.
|
||||
2. **Boundaries honor us.** Your comfort levels are not everyone’s comfort
|
||||
levels. Remember that, and if brought to your attention, heed it.
|
||||
3. **We are our biggest assets.** None of us were born masters of our trade.
|
||||
Each of us has been helped along the way. Return that favor, when and where
|
||||
you can.
|
||||
4. **We are resources for the future.** As an extension of #3, share what you
|
||||
know. Make yourself a resource to help those that come after you.
|
||||
5. **Respect defines us.** Treat others as you wish to be treated. Make your
|
||||
discussions, criticisms and debates from a position of respectfulness. Ask
|
||||
yourself, is it true? Is it necessary? Is it constructive? Anything less is
|
||||
unacceptable.
|
||||
6. **Reactions require grace.** Angry responses are valid, but abusive language
|
||||
and vindictive actions are toxic. When something happens that offends you,
|
||||
handle it assertively, but be respectful. Escalate reasonably, and try to
|
||||
allow the offender an opportunity to explain themselves, and possibly
|
||||
correct the issue.
|
||||
7. **Opinions are just that: opinions.** Each and every one of us, due to our
|
||||
background and upbringing, have varying opinions. That is perfectly
|
||||
acceptable. Remember this: if you respect your own opinions, you should
|
||||
respect the opinions of others.
|
||||
8. **To err is human.** You might not intend it, but mistakes do happen and
|
||||
contribute to build experience. Tolerate honest mistakes, and don't
|
||||
hesitate to apologize if you make one yourself.
|
||||
63 vendor/github.com/asaskevich/govalidator/CONTRIBUTING.md generated vendored Normal file
@@ -0,0 +1,63 @@
#### Support
|
||||
If you do have a contribution to the package, feel free to create a Pull Request or an Issue.
|
||||
|
||||
#### What to contribute
|
||||
If you don't know what to do, there are some features and functions that need to be done
|
||||
|
||||
- [ ] Refactor code
|
||||
- [ ] Edit docs and [README](https://github.com/asaskevich/govalidator/README.md): spellcheck, grammar and typo check
|
||||
- [ ] Create an up-to-date list of contributors and of projects that are currently using this package
|
||||
- [ ] Resolve [issues and bugs](https://github.com/asaskevich/govalidator/issues)
|
||||
- [ ] Update actual [list of functions](https://github.com/asaskevich/govalidator#list-of-functions)
|
||||
- [ ] Update the [list of validators](https://github.com/asaskevich/govalidator#validatestruct-2) that are available for `ValidateStruct` and add new ones
|
||||
- [ ] Implement new validators: `IsFQDN`, `IsIMEI`, `IsPostalCode`, `IsISIN`, `IsISRC` etc
|
||||
- [x] Implement [validation by maps](https://github.com/asaskevich/govalidator/issues/224)
|
||||
- [ ] Implement fuzzing testing
|
||||
- [ ] Implement some struct/map/array utilities
|
||||
- [ ] Implement map/array validation
|
||||
- [ ] Implement benchmarking
|
||||
- [ ] Implement batch of examples
|
||||
- [ ] Look at forks for new features and fixes
|
||||
|
||||
#### Advice
|
||||
Feel free to create what you want, but keep in mind when you implement new features:
|
||||
- Code must be clear and readable, and the names of variables/constants must clearly describe what they are doing
|
||||
- Public functions must be documented and described in the source file and added to the list of available functions in README.md
|
||||
- There must be unit tests for any new functions and improvements
|
||||
|
||||
## Financial contributions
|
||||
|
||||
We also welcome financial contributions in full transparency on our [open collective](https://opencollective.com/govalidator).
|
||||
Anyone can file an expense. If the expense makes sense for the development of the community, it will be "merged" in the ledger of our open collective by the core contributors and the person who filed the expense will be reimbursed.
|
||||
|
||||
|
||||
## Credits
|
||||
|
||||
|
||||
### Contributors
|
||||
|
||||
Thank you to all the people who have already contributed to govalidator!
|
||||
<a href="https://github.com/asaskevich/govalidator/graphs/contributors"><img src="https://opencollective.com/govalidator/contributors.svg?width=890" /></a>
|
||||
|
||||
|
||||
### Backers
|
||||
|
||||
Thank you to all our backers! [[Become a backer](https://opencollective.com/govalidator#backer)]
|
||||
|
||||
<a href="https://opencollective.com/govalidator#backers" target="_blank"><img src="https://opencollective.com/govalidator/backers.svg?width=890"></a>
|
||||
|
||||
|
||||
### Sponsors
|
||||
|
||||
Thank you to all our sponsors! (please ask your company to also support this open source project by [becoming a sponsor](https://opencollective.com/govalidator#sponsor))
|
||||
|
||||
<a href="https://opencollective.com/govalidator/sponsor/0/website" target="_blank"><img src="https://opencollective.com/govalidator/sponsor/0/avatar.svg"></a>
|
||||
<a href="https://opencollective.com/govalidator/sponsor/1/website" target="_blank"><img src="https://opencollective.com/govalidator/sponsor/1/avatar.svg"></a>
|
||||
<a href="https://opencollective.com/govalidator/sponsor/2/website" target="_blank"><img src="https://opencollective.com/govalidator/sponsor/2/avatar.svg"></a>
|
||||
<a href="https://opencollective.com/govalidator/sponsor/3/website" target="_blank"><img src="https://opencollective.com/govalidator/sponsor/3/avatar.svg"></a>
|
||||
<a href="https://opencollective.com/govalidator/sponsor/4/website" target="_blank"><img src="https://opencollective.com/govalidator/sponsor/4/avatar.svg"></a>
|
||||
<a href="https://opencollective.com/govalidator/sponsor/5/website" target="_blank"><img src="https://opencollective.com/govalidator/sponsor/5/avatar.svg"></a>
|
||||
<a href="https://opencollective.com/govalidator/sponsor/6/website" target="_blank"><img src="https://opencollective.com/govalidator/sponsor/6/avatar.svg"></a>
|
||||
<a href="https://opencollective.com/govalidator/sponsor/7/website" target="_blank"><img src="https://opencollective.com/govalidator/sponsor/7/avatar.svg"></a>
|
||||
<a href="https://opencollective.com/govalidator/sponsor/8/website" target="_blank"><img src="https://opencollective.com/govalidator/sponsor/8/avatar.svg"></a>
|
||||
<a href="https://opencollective.com/govalidator/sponsor/9/website" target="_blank"><img src="https://opencollective.com/govalidator/sponsor/9/avatar.svg"></a>
|
||||
21 vendor/github.com/asaskevich/govalidator/LICENSE generated vendored Normal file
@@ -0,0 +1,21 @@
The MIT License (MIT)

Copyright (c) 2014-2020 Alex Saskevich

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
622 vendor/github.com/asaskevich/govalidator/README.md generated vendored Normal file
@@ -0,0 +1,622 @@
govalidator
|
||||
===========
|
||||
[](https://gitter.im/asaskevich/govalidator?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge) [](https://godoc.org/github.com/asaskevich/govalidator)
|
||||
[](https://travis-ci.org/asaskevich/govalidator)
|
||||
[](https://codecov.io/gh/asaskevich/govalidator) [](https://goreportcard.com/report/github.com/asaskevich/govalidator) [](http://go-search.org/view?id=github.com%2Fasaskevich%2Fgovalidator) [](#backers) [](#sponsors) [](https://app.fossa.io/projects/git%2Bgithub.com%2Fasaskevich%2Fgovalidator?ref=badge_shield)
|
||||
|
||||
A package of validators and sanitizers for strings, structs and collections. Based on [validator.js](https://github.com/chriso/validator.js).
|
||||
|
||||
#### Installation
|
||||
Make sure that Go is installed on your computer.
|
||||
Type the following command in your terminal:
|
||||
|
||||
go get github.com/asaskevich/govalidator
|
||||
|
||||
or you can get specified release of the package with `gopkg.in`:
|
||||
|
||||
go get gopkg.in/asaskevich/govalidator.v10
|
||||
|
||||
After that, the package is ready to use.
|
||||
|
||||
|
||||
#### Import package in your project
|
||||
Add following line in your `*.go` file:
|
||||
```go
|
||||
import "github.com/asaskevich/govalidator"
|
||||
```
|
||||
If you don't want to type the long `govalidator` package name, you can alias the import like this:
|
||||
```go
|
||||
import (
|
||||
valid "github.com/asaskevich/govalidator"
|
||||
)
|
||||
```
|
||||
|
||||
#### Activate behavior to require all fields have a validation tag by default
|
||||
`SetFieldsRequiredByDefault` causes validation to fail when struct fields do not include validations or are not explicitly marked as exempt (using `valid:"-"` or `valid:"email,optional"`). A good place to activate this is a package init function or the main() function.
|
||||
|
||||
`SetNilPtrAllowedByRequired` causes validation to pass when struct fields marked by `required` are set to nil. This is disabled by default for consistency, but some packages that need to be able to determine between `nil` and `zero value` state can use this. If disabled, both `nil` and `zero` values cause validation errors.
|
||||
|
||||
```go
|
||||
import "github.com/asaskevich/govalidator"
|
||||
|
||||
func init() {
|
||||
govalidator.SetFieldsRequiredByDefault(true)
|
||||
}
|
||||
```
|
||||
|
||||
Here's some code to explain it:
|
||||
```go
|
||||
// this struct definition will fail govalidator.ValidateStruct() (and the field values do not matter):
|
||||
type exampleStruct struct {
|
||||
Name string ``
|
||||
Email string `valid:"email"`
|
||||
}
|
||||
|
||||
// this, however, will only fail when Email is empty or an invalid email address:
|
||||
type exampleStruct2 struct {
|
||||
Name string `valid:"-"`
|
||||
Email string `valid:"email"`
|
||||
}
|
||||
|
||||
// lastly, this will only fail when Email is an invalid email address but not when it's empty:
|
||||
type exampleStruct2 struct {
|
||||
Name string `valid:"-"`
|
||||
Email string `valid:"email,optional"`
|
||||
}
|
||||
```
|
||||
|
||||
#### Recent breaking changes (see [#123](https://github.com/asaskevich/govalidator/pull/123))
|
||||
##### Custom validator function signature
|
||||
A context was added as the second parameter; for structs this is the object being validated, which makes dependent validation possible.
|
||||
```go
|
||||
import "github.com/asaskevich/govalidator"
|
||||
|
||||
// old signature
|
||||
func(i interface{}) bool
|
||||
|
||||
// new signature
|
||||
func(i interface{}, o interface{}) bool
|
||||
```
|
||||
|
||||
##### Adding a custom validator
|
||||
This was changed to prevent data races when accessing custom validators.
|
||||
```go
|
||||
import "github.com/asaskevich/govalidator"
|
||||
|
||||
// before
|
||||
govalidator.CustomTypeTagMap["customByteArrayValidator"] = func(i interface{}, o interface{}) bool {
|
||||
// ...
|
||||
}
|
||||
|
||||
// after
|
||||
govalidator.CustomTypeTagMap.Set("customByteArrayValidator", func(i interface{}, o interface{}) bool {
|
||||
// ...
|
||||
})
|
||||
```
|
||||
|
||||
#### List of functions:
|
||||
```go
|
||||
func Abs(value float64) float64
|
||||
func BlackList(str, chars string) string
|
||||
func ByteLength(str string, params ...string) bool
|
||||
func CamelCaseToUnderscore(str string) string
|
||||
func Contains(str, substring string) bool
|
||||
func Count(array []interface{}, iterator ConditionIterator) int
|
||||
func Each(array []interface{}, iterator Iterator)
|
||||
func ErrorByField(e error, field string) string
|
||||
func ErrorsByField(e error) map[string]string
|
||||
func Filter(array []interface{}, iterator ConditionIterator) []interface{}
|
||||
func Find(array []interface{}, iterator ConditionIterator) interface{}
|
||||
func GetLine(s string, index int) (string, error)
|
||||
func GetLines(s string) []string
|
||||
func HasLowerCase(str string) bool
|
||||
func HasUpperCase(str string) bool
|
||||
func HasWhitespace(str string) bool
|
||||
func HasWhitespaceOnly(str string) bool
|
||||
func InRange(value interface{}, left interface{}, right interface{}) bool
|
||||
func InRangeFloat32(value, left, right float32) bool
|
||||
func InRangeFloat64(value, left, right float64) bool
|
||||
func InRangeInt(value, left, right interface{}) bool
|
||||
func IsASCII(str string) bool
|
||||
func IsAlpha(str string) bool
|
||||
func IsAlphanumeric(str string) bool
|
||||
func IsBase64(str string) bool
|
||||
func IsByteLength(str string, min, max int) bool
|
||||
func IsCIDR(str string) bool
|
||||
func IsCRC32(str string) bool
|
||||
func IsCRC32b(str string) bool
|
||||
func IsCreditCard(str string) bool
|
||||
func IsDNSName(str string) bool
|
||||
func IsDataURI(str string) bool
|
||||
func IsDialString(str string) bool
|
||||
func IsDivisibleBy(str, num string) bool
|
||||
func IsEmail(str string) bool
|
||||
func IsExistingEmail(email string) bool
|
||||
func IsFilePath(str string) (bool, int)
|
||||
func IsFloat(str string) bool
|
||||
func IsFullWidth(str string) bool
|
||||
func IsHalfWidth(str string) bool
|
||||
func IsHash(str string, algorithm string) bool
|
||||
func IsHexadecimal(str string) bool
|
||||
func IsHexcolor(str string) bool
|
||||
func IsHost(str string) bool
|
||||
func IsIP(str string) bool
|
||||
func IsIPv4(str string) bool
|
||||
func IsIPv6(str string) bool
|
||||
func IsISBN(str string, version int) bool
|
||||
func IsISBN10(str string) bool
|
||||
func IsISBN13(str string) bool
|
||||
func IsISO3166Alpha2(str string) bool
|
||||
func IsISO3166Alpha3(str string) bool
|
||||
func IsISO4217(str string) bool
|
||||
func IsISO693Alpha2(str string) bool
|
||||
func IsISO693Alpha3b(str string) bool
|
||||
func IsIn(str string, params ...string) bool
|
||||
func IsInRaw(str string, params ...string) bool
|
||||
func IsInt(str string) bool
|
||||
func IsJSON(str string) bool
|
||||
func IsLatitude(str string) bool
|
||||
func IsLongitude(str string) bool
|
||||
func IsLowerCase(str string) bool
|
||||
func IsMAC(str string) bool
|
||||
func IsMD4(str string) bool
|
||||
func IsMD5(str string) bool
|
||||
func IsMagnetURI(str string) bool
|
||||
func IsMongoID(str string) bool
|
||||
func IsMultibyte(str string) bool
|
||||
func IsNatural(value float64) bool
|
||||
func IsNegative(value float64) bool
|
||||
func IsNonNegative(value float64) bool
|
||||
func IsNonPositive(value float64) bool
|
||||
func IsNotNull(str string) bool
|
||||
func IsNull(str string) bool
|
||||
func IsNumeric(str string) bool
|
||||
func IsPort(str string) bool
|
||||
func IsPositive(value float64) bool
|
||||
func IsPrintableASCII(str string) bool
|
||||
func IsRFC3339(str string) bool
|
||||
func IsRFC3339WithoutZone(str string) bool
|
||||
func IsRGBcolor(str string) bool
|
||||
func IsRegex(str string) bool
|
||||
func IsRequestURI(rawurl string) bool
|
||||
func IsRequestURL(rawurl string) bool
|
||||
func IsRipeMD128(str string) bool
|
||||
func IsRipeMD160(str string) bool
|
||||
func IsRsaPub(str string, params ...string) bool
|
||||
func IsRsaPublicKey(str string, keylen int) bool
|
||||
func IsSHA1(str string) bool
|
||||
func IsSHA256(str string) bool
|
||||
func IsSHA384(str string) bool
|
||||
func IsSHA512(str string) bool
|
||||
func IsSSN(str string) bool
|
||||
func IsSemver(str string) bool
|
||||
func IsTiger128(str string) bool
|
||||
func IsTiger160(str string) bool
|
||||
func IsTiger192(str string) bool
|
||||
func IsTime(str string, format string) bool
|
||||
func IsType(v interface{}, params ...string) bool
|
||||
func IsURL(str string) bool
|
||||
func IsUTFDigit(str string) bool
|
||||
func IsUTFLetter(str string) bool
|
||||
func IsUTFLetterNumeric(str string) bool
|
||||
func IsUTFNumeric(str string) bool
|
||||
func IsUUID(str string) bool
|
||||
func IsUUIDv3(str string) bool
|
||||
func IsUUIDv4(str string) bool
|
||||
func IsUUIDv5(str string) bool
|
||||
func IsULID(str string) bool
|
||||
func IsUnixTime(str string) bool
|
||||
func IsUpperCase(str string) bool
|
||||
func IsVariableWidth(str string) bool
|
||||
func IsWhole(value float64) bool
|
||||
func LeftTrim(str, chars string) string
|
||||
func Map(array []interface{}, iterator ResultIterator) []interface{}
|
||||
func Matches(str, pattern string) bool
|
||||
func MaxStringLength(str string, params ...string) bool
|
||||
func MinStringLength(str string, params ...string) bool
|
||||
func NormalizeEmail(str string) (string, error)
|
||||
func PadBoth(str string, padStr string, padLen int) string
|
||||
func PadLeft(str string, padStr string, padLen int) string
|
||||
func PadRight(str string, padStr string, padLen int) string
|
||||
func PrependPathToErrors(err error, path string) error
|
||||
func Range(str string, params ...string) bool
|
||||
func RemoveTags(s string) string
|
||||
func ReplacePattern(str, pattern, replace string) string
|
||||
func Reverse(s string) string
|
||||
func RightTrim(str, chars string) string
|
||||
func RuneLength(str string, params ...string) bool
|
||||
func SafeFileName(str string) string
|
||||
func SetFieldsRequiredByDefault(value bool)
|
||||
func SetNilPtrAllowedByRequired(value bool)
|
||||
func Sign(value float64) float64
|
||||
func StringLength(str string, params ...string) bool
|
||||
func StringMatches(s string, params ...string) bool
|
||||
func StripLow(str string, keepNewLines bool) string
|
||||
func ToBoolean(str string) (bool, error)
|
||||
func ToFloat(str string) (float64, error)
|
||||
func ToInt(value interface{}) (res int64, err error)
|
||||
func ToJSON(obj interface{}) (string, error)
|
||||
func ToString(obj interface{}) string
|
||||
func Trim(str, chars string) string
|
||||
func Truncate(str string, length int, ending string) string
|
||||
func TruncatingErrorf(str string, args ...interface{}) error
|
||||
func UnderscoreToCamelCase(s string) string
|
||||
func ValidateMap(inputMap map[string]interface{}, validationMap map[string]interface{}) (bool, error)
|
||||
func ValidateStruct(s interface{}) (bool, error)
|
||||
func WhiteList(str, chars string) string
|
||||
type ConditionIterator
|
||||
type CustomTypeValidator
|
||||
type Error
|
||||
func (e Error) Error() string
|
||||
type Errors
|
||||
func (es Errors) Error() string
|
||||
func (es Errors) Errors() []error
|
||||
type ISO3166Entry
|
||||
type ISO693Entry
|
||||
type InterfaceParamValidator
|
||||
type Iterator
|
||||
type ParamValidator
|
||||
type ResultIterator
|
||||
type UnsupportedTypeError
|
||||
func (e *UnsupportedTypeError) Error() string
|
||||
type Validator
|
||||
```
|
||||
|
||||
#### Examples
|
||||
###### IsURL
|
||||
```go
|
||||
println(govalidator.IsURL(`http://user@pass:domain.com/path/page`))
|
||||
```
|
||||
###### IsType
|
||||
```go
|
||||
println(govalidator.IsType("Bob", "string"))
|
||||
println(govalidator.IsType(1, "int"))
|
||||
i := 1
|
||||
println(govalidator.IsType(&i, "*int"))
|
||||
```
|
||||
|
||||
IsType can be used through the tag `type` which is essential for map validation:
|
||||
```go
|
||||
type User struct {
|
||||
Name string `valid:"type(string)"`
|
||||
Age int `valid:"type(int)"`
|
||||
Meta interface{} `valid:"type(string)"`
|
||||
}
|
||||
result, err := govalidator.ValidateStruct(User{"Bob", 20, "meta"})
|
||||
if err != nil {
|
||||
println("error: " + err.Error())
|
||||
}
|
||||
println(result)
|
||||
```
|
||||
###### ToString
|
||||
```go
|
||||
type User struct {
|
||||
FirstName string
|
||||
LastName string
|
||||
}
|
||||
|
||||
str := govalidator.ToString(&User{"John", "Juan"})
|
||||
println(str)
|
||||
```
|
||||
###### Each, Map, Filter, Count for slices
|
||||
Each iterates over the slice/array and calls Iterator for every item
|
||||
```go
|
||||
data := []interface{}{1, 2, 3, 4, 5}
|
||||
var fn govalidator.Iterator = func(value interface{}, index int) {
|
||||
println(value.(int))
|
||||
}
|
||||
govalidator.Each(data, fn)
|
||||
```
|
||||
```go
|
||||
data := []interface{}{1, 2, 3, 4, 5}
|
||||
var fn govalidator.ResultIterator = func(value interface{}, index int) interface{} {
|
||||
return value.(int) * 3
|
||||
}
|
||||
_ = govalidator.Map(data, fn) // result = []interface{}{1, 6, 9, 12, 15}
|
||||
```
|
||||
```go
|
||||
data := []interface{}{1, 2, 3, 4, 5, 6, 7, 8, 9, 10}
|
||||
var fn govalidator.ConditionIterator = func(value interface{}, index int) bool {
|
||||
return value.(int)%2 == 0
|
||||
}
|
||||
_ = govalidator.Filter(data, fn) // result = []interface{}{2, 4, 6, 8, 10}
|
||||
_ = govalidator.Count(data, fn) // result = 5
|
||||
```
|
||||
###### ValidateStruct [#2](https://github.com/asaskevich/govalidator/pull/2)
|
||||
If you want to validate structs, you can use tag `valid` for any field in your structure. All validators used with this field in one tag are separated by comma. If you want to skip validation, place `-` in your tag. If you need a validator that is not on the list below, you can add it like this:
|
||||
```go
|
||||
govalidator.TagMap["duck"] = govalidator.Validator(func(str string) bool {
|
||||
return str == "duck"
|
||||
})
|
||||
```
|
||||
For completely custom validators (interface-based), see below.
|
||||
|
||||
Here is a list of available validators for struct fields (validator - used function):
|
||||
```go
|
||||
"email": IsEmail,
|
||||
"url": IsURL,
|
||||
"dialstring": IsDialString,
|
||||
"requrl": IsRequestURL,
|
||||
"requri": IsRequestURI,
|
||||
"alpha": IsAlpha,
|
||||
"utfletter": IsUTFLetter,
|
||||
"alphanum": IsAlphanumeric,
|
||||
"utfletternum": IsUTFLetterNumeric,
|
||||
"numeric": IsNumeric,
|
||||
"utfnumeric": IsUTFNumeric,
|
||||
"utfdigit": IsUTFDigit,
|
||||
"hexadecimal": IsHexadecimal,
|
||||
"hexcolor": IsHexcolor,
|
||||
"rgbcolor": IsRGBcolor,
|
||||
"lowercase": IsLowerCase,
|
||||
"uppercase": IsUpperCase,
|
||||
"int": IsInt,
|
||||
"float": IsFloat,
|
||||
"null": IsNull,
|
||||
"uuid": IsUUID,
|
||||
"uuidv3": IsUUIDv3,
|
||||
"uuidv4": IsUUIDv4,
|
||||
"uuidv5": IsUUIDv5,
|
||||
"creditcard": IsCreditCard,
|
||||
"isbn10": IsISBN10,
|
||||
"isbn13": IsISBN13,
|
||||
"json": IsJSON,
|
||||
"multibyte": IsMultibyte,
|
||||
"ascii": IsASCII,
|
||||
"printableascii": IsPrintableASCII,
|
||||
"fullwidth": IsFullWidth,
|
||||
"halfwidth": IsHalfWidth,
|
||||
"variablewidth": IsVariableWidth,
|
||||
"base64": IsBase64,
|
||||
"datauri": IsDataURI,
|
||||
"ip": IsIP,
|
||||
"port": IsPort,
|
||||
"ipv4": IsIPv4,
|
||||
"ipv6": IsIPv6,
|
||||
"dns": IsDNSName,
|
||||
"host": IsHost,
|
||||
"mac": IsMAC,
|
||||
"latitude": IsLatitude,
|
||||
"longitude": IsLongitude,
|
||||
"ssn": IsSSN,
|
||||
"semver": IsSemver,
|
||||
"rfc3339": IsRFC3339,
|
||||
"rfc3339WithoutZone": IsRFC3339WithoutZone,
|
||||
"ISO3166Alpha2": IsISO3166Alpha2,
|
||||
"ISO3166Alpha3": IsISO3166Alpha3,
|
||||
"ulid": IsULID,
|
||||
```
|
||||
Validators with parameters
|
||||
|
||||
```go
|
||||
"range(min|max)": Range,
|
||||
"length(min|max)": ByteLength,
|
||||
"runelength(min|max)": RuneLength,
|
||||
"stringlength(min|max)": StringLength,
|
||||
"matches(pattern)": StringMatches,
|
||||
"in(string1|string2|...|stringN)": IsIn,
|
||||
"rsapub(keylength)" : IsRsaPub,
|
||||
"minstringlength(int): MinStringLength,
|
||||
"maxstringlength(int): MaxStringLength,
|
||||
```
|
||||
Validators with parameters for any type
|
||||
|
||||
```go
|
||||
"type(type)": IsType,
|
||||
```
|
||||
|
||||
And here is small example of usage:
|
||||
```go
|
||||
type Post struct {
|
||||
Title string `valid:"alphanum,required"`
|
||||
Message string `valid:"duck,ascii"`
|
||||
Message2 string `valid:"animal(dog)"`
|
||||
AuthorIP string `valid:"ipv4"`
|
||||
Date string `valid:"-"`
|
||||
}
|
||||
post := &Post{
|
||||
Title: "My Example Post",
|
||||
Message: "duck",
|
||||
Message2: "dog",
|
||||
AuthorIP: "123.234.54.3",
|
||||
}
|
||||
|
||||
// Add your own struct validation tags
|
||||
govalidator.TagMap["duck"] = govalidator.Validator(func(str string) bool {
|
||||
return str == "duck"
|
||||
})
|
||||
|
||||
// Add your own struct validation tags with parameter
|
||||
govalidator.ParamTagMap["animal"] = govalidator.ParamValidator(func(str string, params ...string) bool {
|
||||
species := params[0]
|
||||
return str == species
|
||||
})
|
||||
govalidator.ParamTagRegexMap["animal"] = regexp.MustCompile("^animal\\((\\w+)\\)$")
|
||||
|
||||
result, err := govalidator.ValidateStruct(post)
|
||||
if err != nil {
|
||||
println("error: " + err.Error())
|
||||
}
|
||||
println(result)
|
||||
```
|
||||
###### ValidateMap [#2](https://github.com/asaskevich/govalidator/pull/338)
|
||||
If you want to validate maps, pass the map to be validated together with a validation map that contains the same tags used in ValidateStruct; both maps have to be in the form `map[string]interface{}`.
|
||||
|
||||
So here is small example of usage:
|
||||
```go
|
||||
var mapTemplate = map[string]interface{}{
|
||||
"name":"required,alpha",
|
||||
"family":"required,alpha",
|
||||
"email":"required,email",
|
||||
"cell-phone":"numeric",
|
||||
"address":map[string]interface{}{
|
||||
"line1":"required,alphanum",
|
||||
"line2":"alphanum",
|
||||
"postal-code":"numeric",
|
||||
},
|
||||
}
|
||||
|
||||
var inputMap = map[string]interface{}{
|
||||
"name":"Bob",
|
||||
"family":"Smith",
|
||||
"email":"foo@bar.baz",
|
||||
"address":map[string]interface{}{
|
||||
"line1":"",
|
||||
"line2":"",
|
||||
"postal-code":"",
|
||||
},
|
||||
}
|
||||
|
||||
result, err := govalidator.ValidateMap(inputMap, mapTemplate)
|
||||
if err != nil {
|
||||
println("error: " + err.Error())
|
||||
}
|
||||
println(result)
|
||||
```
|
||||
|
||||
###### WhiteList
|
||||
```go
|
||||
// Remove all characters from string ignoring characters between "a" and "z"
|
||||
println(govalidator.WhiteList("a3a43a5a4a3a2a23a4a5a4a3a4", "a-z") == "aaaaaaaaaaaa")
|
||||
```
|
||||
|
||||
###### Custom validation functions
|
||||
Custom validation using your own domain specific validators is also available - here's an example of how to use it:
|
||||
```go
|
||||
import "github.com/asaskevich/govalidator"
|
||||
|
||||
type CustomByteArray [6]byte // custom types are supported and can be validated
|
||||
|
||||
type StructWithCustomByteArray struct {
|
||||
ID CustomByteArray `valid:"customByteArrayValidator,customMinLengthValidator"` // multiple custom validators are possible as well and will be evaluated in sequence
|
||||
Email string `valid:"email"`
|
||||
CustomMinLength int `valid:"-"`
|
||||
}
|
||||
|
||||
govalidator.CustomTypeTagMap.Set("customByteArrayValidator", func(i interface{}, context interface{}) bool {
|
||||
switch v := context.(type) { // you can type switch on the context interface being validated
|
||||
case StructWithCustomByteArray:
|
||||
// you can check and validate against some other field in the context,
|
||||
// return early or not validate against the context at all – your choice
|
||||
case SomeOtherType:
|
||||
// ...
|
||||
default:
|
||||
// expecting some other type? Throw/panic here or continue
|
||||
}
|
||||
|
||||
switch v := i.(type) { // type switch on the struct field being validated
|
||||
case CustomByteArray:
|
||||
for _, e := range v { // this validator checks that the byte array is not empty, i.e. not all zeroes
|
||||
if e != 0 {
|
||||
return true
|
||||
}
|
||||
}
|
||||
}
|
||||
return false
|
||||
})
|
||||
govalidator.CustomTypeTagMap.Set("customMinLengthValidator", func(i interface{}, context interface{}) bool {
|
||||
switch v := context.(type) { // this validates a field against the value in another field, i.e. dependent validation
|
||||
case StructWithCustomByteArray:
|
||||
return len(v.ID) >= v.CustomMinLength
|
||||
}
|
||||
return false
|
||||
})
|
||||
```
|
||||
|
||||
###### Loop over Error()
|
||||
By default, .Error() returns all errors as a single string. To access each error you can do this:
|
||||
```go
|
||||
if err != nil {
|
||||
errs := err.(govalidator.Errors).Errors()
|
||||
for _, e := range errs {
|
||||
fmt.Println(e.Error())
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
###### Custom error messages
|
||||
Custom error messages are supported via annotations by adding the `~` separator - here's an example of how to use it:
|
||||
```go
|
||||
type Ticket struct {
|
||||
Id int64 `json:"id"`
|
||||
FirstName string `json:"firstname" valid:"required~First name is blank"`
|
||||
}
|
||||
```
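For completeness, a minimal sketch of how the custom message surfaces; this is illustrative only and not part of the upstream README, reusing the `Ticket` struct defined just above.

```go
package main

import (
	"fmt"

	"github.com/asaskevich/govalidator"
)

type Ticket struct {
	Id        int64  `json:"id"`
	FirstName string `json:"firstname" valid:"required~First name is blank"`
}

func main() {
	// With the ~ annotation, the custom text replaces the generated message
	// when the `required` check fails on an empty FirstName.
	_, err := govalidator.ValidateStruct(Ticket{Id: 1})
	if err != nil {
		fmt.Println(err) // First name is blank
	}
}
```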
|
||||
|
||||
#### Notes
|
||||
Documentation is available here: [godoc.org](https://godoc.org/github.com/asaskevich/govalidator).
|
||||
Full information about code coverage is also available here: [govalidator on gocover.io](http://gocover.io/github.com/asaskevich/govalidator).
|
||||
|
||||
#### Support
|
||||
If you do have a contribution to the package, feel free to create a Pull Request or an Issue.
|
||||
|
||||
#### What to contribute
|
||||
If you don't know what to do, there are some features and functions that need to be done
|
||||
|
||||
- [ ] Refactor code
|
||||
- [ ] Edit docs and [README](https://github.com/asaskevich/govalidator/README.md): spellcheck, grammar and typo check
|
||||
- [ ] Create an up-to-date list of contributors and of projects that are currently using this package
|
||||
- [ ] Resolve [issues and bugs](https://github.com/asaskevich/govalidator/issues)
|
||||
- [ ] Update actual [list of functions](https://github.com/asaskevich/govalidator#list-of-functions)
|
||||
- [ ] Update the [list of validators](https://github.com/asaskevich/govalidator#validatestruct-2) that are available for `ValidateStruct` and add new ones
|
||||
- [ ] Implement new validators: `IsFQDN`, `IsIMEI`, `IsPostalCode`, `IsISIN`, `IsISRC` etc
|
||||
- [x] Implement [validation by maps](https://github.com/asaskevich/govalidator/issues/224)
|
||||
- [ ] Implement fuzzing testing
|
||||
- [ ] Implement some struct/map/array utilities
|
||||
- [ ] Implement map/array validation
|
||||
- [ ] Implement benchmarking
|
||||
- [ ] Implement batch of examples
|
||||
- [ ] Look at forks for new features and fixes
|
||||
|
||||
#### Advice
|
||||
Feel free to create what you want, but keep in mind when you implement new features:
|
||||
- Code must be clear and readable, and the names of variables/constants must clearly describe what they are doing
|
||||
- Public functions must be documented and described in the source file and added to the list of available functions in README.md
|
||||
- There must be unit tests for any new functions and improvements
|
||||
|
||||
## Credits
|
||||
### Contributors
|
||||
|
||||
This project exists thanks to all the people who contribute. [[Contribute](CONTRIBUTING.md)].
|
||||
|
||||
#### Special thanks to [contributors](https://github.com/asaskevich/govalidator/graphs/contributors)
|
||||
* [Daniel Lohse](https://github.com/annismckenzie)
|
||||
* [Attila Oláh](https://github.com/attilaolah)
|
||||
* [Daniel Korner](https://github.com/Dadie)
|
||||
* [Steven Wilkin](https://github.com/stevenwilkin)
|
||||
* [Deiwin Sarjas](https://github.com/deiwin)
|
||||
* [Noah Shibley](https://github.com/slugmobile)
|
||||
* [Nathan Davies](https://github.com/nathj07)
|
||||
* [Matt Sanford](https://github.com/mzsanford)
|
||||
* [Simon ccl1115](https://github.com/ccl1115)
|
||||
|
||||
<a href="https://github.com/asaskevich/govalidator/graphs/contributors"><img src="https://opencollective.com/govalidator/contributors.svg?width=890" /></a>
|
||||
|
||||
|
||||
### Backers
|
||||
|
||||
Thank you to all our backers! 🙏 [[Become a backer](https://opencollective.com/govalidator#backer)]
|
||||
|
||||
<a href="https://opencollective.com/govalidator#backers" target="_blank"><img src="https://opencollective.com/govalidator/backers.svg?width=890"></a>
|
||||
|
||||
|
||||
### Sponsors
|
||||
|
||||
Support this project by becoming a sponsor. Your logo will show up here with a link to your website. [[Become a sponsor](https://opencollective.com/govalidator#sponsor)]
|
||||
|
||||
<a href="https://opencollective.com/govalidator/sponsor/0/website" target="_blank"><img src="https://opencollective.com/govalidator/sponsor/0/avatar.svg"></a>
|
||||
<a href="https://opencollective.com/govalidator/sponsor/1/website" target="_blank"><img src="https://opencollective.com/govalidator/sponsor/1/avatar.svg"></a>
|
||||
<a href="https://opencollective.com/govalidator/sponsor/2/website" target="_blank"><img src="https://opencollective.com/govalidator/sponsor/2/avatar.svg"></a>
|
||||
<a href="https://opencollective.com/govalidator/sponsor/3/website" target="_blank"><img src="https://opencollective.com/govalidator/sponsor/3/avatar.svg"></a>
|
||||
<a href="https://opencollective.com/govalidator/sponsor/4/website" target="_blank"><img src="https://opencollective.com/govalidator/sponsor/4/avatar.svg"></a>
|
||||
<a href="https://opencollective.com/govalidator/sponsor/5/website" target="_blank"><img src="https://opencollective.com/govalidator/sponsor/5/avatar.svg"></a>
|
||||
<a href="https://opencollective.com/govalidator/sponsor/6/website" target="_blank"><img src="https://opencollective.com/govalidator/sponsor/6/avatar.svg"></a>
|
||||
<a href="https://opencollective.com/govalidator/sponsor/7/website" target="_blank"><img src="https://opencollective.com/govalidator/sponsor/7/avatar.svg"></a>
|
||||
<a href="https://opencollective.com/govalidator/sponsor/8/website" target="_blank"><img src="https://opencollective.com/govalidator/sponsor/8/avatar.svg"></a>
|
||||
<a href="https://opencollective.com/govalidator/sponsor/9/website" target="_blank"><img src="https://opencollective.com/govalidator/sponsor/9/avatar.svg"></a>
|
||||
|
||||
|
||||
|
||||
|
||||
## License
|
||||
[](https://app.fossa.io/projects/git%2Bgithub.com%2Fasaskevich%2Fgovalidator?ref=badge_large)
|
||||
87 vendor/github.com/asaskevich/govalidator/arrays.go generated vendored Normal file
@@ -0,0 +1,87 @@
package govalidator

// Iterator is the function that accepts element of slice/array and its index
type Iterator func(interface{}, int)

// ResultIterator is the function that accepts element of slice/array and its index and returns any result
type ResultIterator func(interface{}, int) interface{}

// ConditionIterator is the function that accepts element of slice/array and its index and returns boolean
type ConditionIterator func(interface{}, int) bool

// ReduceIterator is the function that accepts two element of slice/array and returns result of merging those values
type ReduceIterator func(interface{}, interface{}) interface{}

// Some validates that any item of array corresponds to ConditionIterator. Returns boolean.
func Some(array []interface{}, iterator ConditionIterator) bool {
	res := false
	for index, data := range array {
		res = res || iterator(data, index)
	}
	return res
}

// Every validates that every item of array corresponds to ConditionIterator. Returns boolean.
func Every(array []interface{}, iterator ConditionIterator) bool {
	res := true
	for index, data := range array {
		res = res && iterator(data, index)
	}
	return res
}

// Reduce boils down a list of values into a single value by ReduceIterator
func Reduce(array []interface{}, iterator ReduceIterator, initialValue interface{}) interface{} {
	for _, data := range array {
		initialValue = iterator(initialValue, data)
	}
	return initialValue
}

// Each iterates over the slice and apply Iterator to every item
func Each(array []interface{}, iterator Iterator) {
	for index, data := range array {
		iterator(data, index)
	}
}

// Map iterates over the slice and apply ResultIterator to every item. Returns new slice as a result.
func Map(array []interface{}, iterator ResultIterator) []interface{} {
	var result = make([]interface{}, len(array))
	for index, data := range array {
		result[index] = iterator(data, index)
	}
	return result
}

// Find iterates over the slice and apply ConditionIterator to every item. Returns first item that meet ConditionIterator or nil otherwise.
func Find(array []interface{}, iterator ConditionIterator) interface{} {
	for index, data := range array {
		if iterator(data, index) {
			return data
		}
	}
	return nil
}

// Filter iterates over the slice and apply ConditionIterator to every item. Returns new slice.
func Filter(array []interface{}, iterator ConditionIterator) []interface{} {
	var result = make([]interface{}, 0)
	for index, data := range array {
		if iterator(data, index) {
			result = append(result, data)
		}
	}
	return result
}

// Count iterates over the slice and apply ConditionIterator to every item. Returns count of items that meets ConditionIterator.
func Count(array []interface{}, iterator ConditionIterator) int {
	count := 0
	for index, data := range array {
		if iterator(data, index) {
			count = count + 1
		}
	}
	return count
}
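The README in this diff demonstrates Each, Map, Filter and Count; the Some, Every and Reduce helpers above follow the same iterator pattern. A minimal sketch with illustrative values only:

```go
package main

import (
	"fmt"

	"github.com/asaskevich/govalidator"
)

func main() {
	data := []interface{}{1, 2, 3, 4, 5}

	// Reduce folds the slice with an accumulator-first iterator.
	sum := govalidator.Reduce(data, func(acc interface{}, v interface{}) interface{} {
		return acc.(int) + v.(int)
	}, 0)

	// Some/Every take ConditionIterator predicates.
	anyEven := govalidator.Some(data, func(v interface{}, _ int) bool { return v.(int)%2 == 0 })
	allPositive := govalidator.Every(data, func(v interface{}, _ int) bool { return v.(int) > 0 })

	fmt.Println(sum, anyEven, allPositive) // 15 true true
}
```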
81 vendor/github.com/asaskevich/govalidator/converter.go generated vendored Normal file
@@ -0,0 +1,81 @@
package govalidator

import (
	"encoding/json"
	"fmt"
	"reflect"
	"strconv"
)

// ToString convert the input to a string.
func ToString(obj interface{}) string {
	res := fmt.Sprintf("%v", obj)
	return res
}

// ToJSON convert the input to a valid JSON string
func ToJSON(obj interface{}) (string, error) {
	res, err := json.Marshal(obj)
	if err != nil {
		res = []byte("")
	}
	return string(res), err
}

// ToFloat convert the input string to a float, or 0.0 if the input is not a float.
func ToFloat(value interface{}) (res float64, err error) {
	val := reflect.ValueOf(value)

	switch value.(type) {
	case int, int8, int16, int32, int64:
		res = float64(val.Int())
	case uint, uint8, uint16, uint32, uint64:
		res = float64(val.Uint())
	case float32, float64:
		res = val.Float()
	case string:
		res, err = strconv.ParseFloat(val.String(), 64)
		if err != nil {
			res = 0
		}
	default:
		err = fmt.Errorf("ToInt: unknown interface type %T", value)
		res = 0
	}

	return
}

// ToInt convert the input string or any int type to an integer type 64, or 0 if the input is not an integer.
func ToInt(value interface{}) (res int64, err error) {
	val := reflect.ValueOf(value)

	switch value.(type) {
	case int, int8, int16, int32, int64:
		res = val.Int()
	case uint, uint8, uint16, uint32, uint64:
		res = int64(val.Uint())
	case float32, float64:
		res = int64(val.Float())
	case string:
		if IsInt(val.String()) {
			res, err = strconv.ParseInt(val.String(), 0, 64)
			if err != nil {
				res = 0
			}
		} else {
			err = fmt.Errorf("ToInt: invalid numeric format %g", value)
			res = 0
		}
	default:
		err = fmt.Errorf("ToInt: unknown interface type %T", value)
		res = 0
	}

	return
}

// ToBoolean convert the input string to a boolean.
func ToBoolean(str string) (bool, error) {
	return strconv.ParseBool(str)
}
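A minimal sketch of the converters above, with illustrative inputs only; note they fall back to the zero value and return an error when the input cannot be interpreted.

```go
package main

import (
	"fmt"

	"github.com/asaskevich/govalidator"
)

func main() {
	f, _ := govalidator.ToFloat("3.14")
	i, _ := govalidator.ToInt("42")
	b, _ := govalidator.ToBoolean("true")
	_, err := govalidator.ToInt("not-a-number") // err is set, result is 0

	fmt.Println(f, i, b, err != nil) // 3.14 42 true true
}
```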
3 vendor/github.com/asaskevich/govalidator/doc.go generated vendored Normal file
@@ -0,0 +1,3 @@
package govalidator

// A package of validators and sanitizers for strings, structures and collections.
47 vendor/github.com/asaskevich/govalidator/error.go generated vendored Normal file
@@ -0,0 +1,47 @@
package govalidator

import (
	"sort"
	"strings"
)

// Errors is an array of multiple errors and conforms to the error interface.
type Errors []error

// Errors returns itself.
func (es Errors) Errors() []error {
	return es
}

func (es Errors) Error() string {
	var errs []string
	for _, e := range es {
		errs = append(errs, e.Error())
	}
	sort.Strings(errs)
	return strings.Join(errs, ";")
}

// Error encapsulates a name, an error and whether there's a custom error message or not.
type Error struct {
	Name                     string
	Err                      error
	CustomErrorMessageExists bool

	// Validator indicates the name of the validator that failed
	Validator string
	Path      []string
}

func (e Error) Error() string {
	if e.CustomErrorMessageExists {
		return e.Err.Error()
	}

	errName := e.Name
	if len(e.Path) > 0 {
		errName = strings.Join(append(e.Path, e.Name), ".")
	}

	return errName + ": " + e.Err.Error()
}
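A minimal sketch of how Errors and Error compose messages, with illustrative values only: Errors sorts the rendered messages and joins them with ";", and each Error prefixes its message with the (possibly nested) field name unless a custom message is flagged.

```go
package main

import (
	"errors"
	"fmt"

	"github.com/asaskevich/govalidator"
)

func main() {
	es := govalidator.Errors{
		govalidator.Error{Name: "Email", Err: errors.New("invalid"), Validator: "email"},
		govalidator.Error{Name: "Age", Err: errors.New("too small"), Path: []string{"Profile"}},
	}
	fmt.Println(es.Error()) // Email: invalid;Profile.Age: too small
}
```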
100 vendor/github.com/asaskevich/govalidator/numerics.go generated vendored Normal file
@@ -0,0 +1,100 @@
package govalidator

import (
	"math"
)

// Abs returns absolute value of number
func Abs(value float64) float64 {
	return math.Abs(value)
}

// Sign returns signum of number: 1 in case of value > 0, -1 in case of value < 0, 0 otherwise
func Sign(value float64) float64 {
	if value > 0 {
		return 1
	} else if value < 0 {
		return -1
	} else {
		return 0
	}
}

// IsNegative returns true if value < 0
func IsNegative(value float64) bool {
	return value < 0
}

// IsPositive returns true if value > 0
func IsPositive(value float64) bool {
	return value > 0
}

// IsNonNegative returns true if value >= 0
func IsNonNegative(value float64) bool {
	return value >= 0
}

// IsNonPositive returns true if value <= 0
func IsNonPositive(value float64) bool {
	return value <= 0
}

// InRangeInt returns true if value lies between left and right border
func InRangeInt(value, left, right interface{}) bool {
	value64, _ := ToInt(value)
	left64, _ := ToInt(left)
	right64, _ := ToInt(right)
	if left64 > right64 {
		left64, right64 = right64, left64
	}
	return value64 >= left64 && value64 <= right64
}

// InRangeFloat32 returns true if value lies between left and right border
func InRangeFloat32(value, left, right float32) bool {
	if left > right {
		left, right = right, left
	}
	return value >= left && value <= right
}

// InRangeFloat64 returns true if value lies between left and right border
func InRangeFloat64(value, left, right float64) bool {
	if left > right {
		left, right = right, left
	}
	return value >= left && value <= right
}

// InRange returns true if value lies between left and right border, generic type to handle int, float32, float64 and string.
// All types must the same type.
// False if value doesn't lie in range or if it incompatible or not comparable
func InRange(value interface{}, left interface{}, right interface{}) bool {
	switch value.(type) {
	case int:
		intValue, _ := ToInt(value)
		intLeft, _ := ToInt(left)
		intRight, _ := ToInt(right)
		return InRangeInt(intValue, intLeft, intRight)
	case float32, float64:
		intValue, _ := ToFloat(value)
		intLeft, _ := ToFloat(left)
		intRight, _ := ToFloat(right)
		return InRangeFloat64(intValue, intLeft, intRight)
	case string:
		return value.(string) >= left.(string) && value.(string) <= right.(string)
	default:
		return false
	}
}

// IsWhole returns true if value is whole number
func IsWhole(value float64) bool {
	return math.Remainder(value, 1) == 0
}

// IsNatural returns true if value is natural number (positive and whole)
func IsNatural(value float64) bool {
	return IsWhole(value) && IsPositive(value)
}
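A minimal sketch of the range helpers above, with illustrative values only; InRange dispatches on the value's type, and the numeric variants accept the borders in either order.

```go
package main

import (
	"fmt"

	"github.com/asaskevich/govalidator"
)

func main() {
	fmt.Println(govalidator.InRange(5, 1, 10))          // true
	fmt.Println(govalidator.InRangeInt(5, 10, 1))       // true (borders swapped internally)
	fmt.Println(govalidator.InRangeFloat64(2.5, 0, 2))  // false
	fmt.Println(govalidator.IsNatural(3.0))             // true
}
```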
113 vendor/github.com/asaskevich/govalidator/patterns.go generated vendored Normal file
@@ -0,0 +1,113 @@
package govalidator
|
||||
|
||||
import "regexp"
|
||||
|
||||
// Basic regular expressions for validating strings
|
||||
const (
|
||||
Email string = "^(((([a-zA-Z]|\\d|[!#\\$%&'\\*\\+\\-\\/=\\?\\^_`{\\|}~]|[\\x{00A0}-\\x{D7FF}\\x{F900}-\\x{FDCF}\\x{FDF0}-\\x{FFEF}])+(\\.([a-zA-Z]|\\d|[!#\\$%&'\\*\\+\\-\\/=\\?\\^_`{\\|}~]|[\\x{00A0}-\\x{D7FF}\\x{F900}-\\x{FDCF}\\x{FDF0}-\\x{FFEF}])+)*)|((\\x22)((((\\x20|\\x09)*(\\x0d\\x0a))?(\\x20|\\x09)+)?(([\\x01-\\x08\\x0b\\x0c\\x0e-\\x1f\\x7f]|\\x21|[\\x23-\\x5b]|[\\x5d-\\x7e]|[\\x{00A0}-\\x{D7FF}\\x{F900}-\\x{FDCF}\\x{FDF0}-\\x{FFEF}])|(\\([\\x01-\\x09\\x0b\\x0c\\x0d-\\x7f]|[\\x{00A0}-\\x{D7FF}\\x{F900}-\\x{FDCF}\\x{FDF0}-\\x{FFEF}]))))*(((\\x20|\\x09)*(\\x0d\\x0a))?(\\x20|\\x09)+)?(\\x22)))@((([a-zA-Z]|\\d|[\\x{00A0}-\\x{D7FF}\\x{F900}-\\x{FDCF}\\x{FDF0}-\\x{FFEF}])|(([a-zA-Z]|\\d|[\\x{00A0}-\\x{D7FF}\\x{F900}-\\x{FDCF}\\x{FDF0}-\\x{FFEF}])([a-zA-Z]|\\d|-|\\.|_|~|[\\x{00A0}-\\x{D7FF}\\x{F900}-\\x{FDCF}\\x{FDF0}-\\x{FFEF}])*([a-zA-Z]|\\d|[\\x{00A0}-\\x{D7FF}\\x{F900}-\\x{FDCF}\\x{FDF0}-\\x{FFEF}])))\\.)+(([a-zA-Z]|[\\x{00A0}-\\x{D7FF}\\x{F900}-\\x{FDCF}\\x{FDF0}-\\x{FFEF}])|(([a-zA-Z]|[\\x{00A0}-\\x{D7FF}\\x{F900}-\\x{FDCF}\\x{FDF0}-\\x{FFEF}])([a-zA-Z]|\\d|-|_|~|[\\x{00A0}-\\x{D7FF}\\x{F900}-\\x{FDCF}\\x{FDF0}-\\x{FFEF}])*([a-zA-Z]|[\\x{00A0}-\\x{D7FF}\\x{F900}-\\x{FDCF}\\x{FDF0}-\\x{FFEF}])))\\.?$"
|
||||
CreditCard string = "^(?:4[0-9]{12}(?:[0-9]{3})?|5[1-5][0-9]{14}|(222[1-9]|22[3-9][0-9]|2[3-6][0-9]{2}|27[01][0-9]|2720)[0-9]{12}|6(?:011|5[0-9][0-9])[0-9]{12}|3[47][0-9]{13}|3(?:0[0-5]|[68][0-9])[0-9]{11}|(?:2131|1800|35\\d{3})\\d{11}|6[27][0-9]{14})$"
|
||||
ISBN10 string = "^(?:[0-9]{9}X|[0-9]{10})$"
|
||||
ISBN13 string = "^(?:[0-9]{13})$"
|
||||
UUID3 string = "^[0-9a-f]{8}-[0-9a-f]{4}-3[0-9a-f]{3}-[0-9a-f]{4}-[0-9a-f]{12}$"
|
||||
UUID4 string = "^[0-9a-f]{8}-[0-9a-f]{4}-4[0-9a-f]{3}-[89ab][0-9a-f]{3}-[0-9a-f]{12}$"
|
||||
UUID5 string = "^[0-9a-f]{8}-[0-9a-f]{4}-5[0-9a-f]{3}-[89ab][0-9a-f]{3}-[0-9a-f]{12}$"
|
||||
UUID string = "^[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}$"
|
||||
Alpha string = "^[a-zA-Z]+$"
|
||||
Alphanumeric string = "^[a-zA-Z0-9]+$"
|
||||
Numeric string = "^[0-9]+$"
|
||||
Int string = "^(?:[-+]?(?:0|[1-9][0-9]*))$"
|
||||
Float string = "^(?:[-+]?(?:[0-9]+))?(?:\\.[0-9]*)?(?:[eE][\\+\\-]?(?:[0-9]+))?$"
|
||||
Hexadecimal string = "^[0-9a-fA-F]+$"
|
||||
Hexcolor string = "^#?([0-9a-fA-F]{3}|[0-9a-fA-F]{6})$"
|
||||
RGBcolor string = "^rgb\\(\\s*(0|[1-9]\\d?|1\\d\\d?|2[0-4]\\d|25[0-5])\\s*,\\s*(0|[1-9]\\d?|1\\d\\d?|2[0-4]\\d|25[0-5])\\s*,\\s*(0|[1-9]\\d?|1\\d\\d?|2[0-4]\\d|25[0-5])\\s*\\)$"
|
||||
ASCII string = "^[\x00-\x7F]+$"
|
||||
Multibyte string = "[^\x00-\x7F]"
|
||||
FullWidth string = "[^\u0020-\u007E\uFF61-\uFF9F\uFFA0-\uFFDC\uFFE8-\uFFEE0-9a-zA-Z]"
|
||||
HalfWidth string = "[\u0020-\u007E\uFF61-\uFF9F\uFFA0-\uFFDC\uFFE8-\uFFEE0-9a-zA-Z]"
|
||||
Base64 string = "^(?:[A-Za-z0-9+\\/]{4})*(?:[A-Za-z0-9+\\/]{2}==|[A-Za-z0-9+\\/]{3}=|[A-Za-z0-9+\\/]{4})$"
|
||||
PrintableASCII string = "^[\x20-\x7E]+$"
|
||||
DataURI string = "^data:.+\\/(.+);base64$"
|
||||
MagnetURI string = "^magnet:\\?xt=urn:[a-zA-Z0-9]+:[a-zA-Z0-9]{32,40}&dn=.+&tr=.+$"
|
||||
Latitude string = "^[-+]?([1-8]?\\d(\\.\\d+)?|90(\\.0+)?)$"
|
||||
Longitude string = "^[-+]?(180(\\.0+)?|((1[0-7]\\d)|([1-9]?\\d))(\\.\\d+)?)$"
|
||||
DNSName string = `^([a-zA-Z0-9_]{1}[a-zA-Z0-9_-]{0,62}){1}(\.[a-zA-Z0-9_]{1}[a-zA-Z0-9_-]{0,62})*[\._]?$`
|
||||
IP string = `(([0-9a-fA-F]{1,4}:){7,7}[0-9a-fA-F]{1,4}|([0-9a-fA-F]{1,4}:){1,7}:|([0-9a-fA-F]{1,4}:){1,6}:[0-9a-fA-F]{1,4}|([0-9a-fA-F]{1,4}:){1,5}(:[0-9a-fA-F]{1,4}){1,2}|([0-9a-fA-F]{1,4}:){1,4}(:[0-9a-fA-F]{1,4}){1,3}|([0-9a-fA-F]{1,4}:){1,3}(:[0-9a-fA-F]{1,4}){1,4}|([0-9a-fA-F]{1,4}:){1,2}(:[0-9a-fA-F]{1,4}){1,5}|[0-9a-fA-F]{1,4}:((:[0-9a-fA-F]{1,4}){1,6})|:((:[0-9a-fA-F]{1,4}){1,7}|:)|fe80:(:[0-9a-fA-F]{0,4}){0,4}%[0-9a-zA-Z]{1,}|::(ffff(:0{1,4}){0,1}:){0,1}((25[0-5]|(2[0-4]|1{0,1}[0-9]){0,1}[0-9])\.){3,3}(25[0-5]|(2[0-4]|1{0,1}[0-9]){0,1}[0-9])|([0-9a-fA-F]{1,4}:){1,4}:((25[0-5]|(2[0-4]|1{0,1}[0-9]){0,1}[0-9])\.){3,3}(25[0-5]|(2[0-4]|1{0,1}[0-9]){0,1}[0-9]))`
|
||||
URLSchema string = `((ftp|tcp|udp|wss?|https?):\/\/)`
|
||||
URLUsername string = `(\S+(:\S*)?@)`
|
||||
URLPath string = `((\/|\?|#)[^\s]*)`
|
||||
URLPort string = `(:(\d{1,5}))`
|
||||
URLIP string = `([1-9]\d?|1\d\d|2[01]\d|22[0-3]|24\d|25[0-5])(\.(\d{1,2}|1\d\d|2[0-4]\d|25[0-5])){2}(?:\.([0-9]\d?|1\d\d|2[0-4]\d|25[0-5]))`
|
||||
URLSubdomain string = `((www\.)|([a-zA-Z0-9]+([-_\.]?[a-zA-Z0-9])*[a-zA-Z0-9]\.[a-zA-Z0-9]+))`
|
||||
URL = `^` + URLSchema + `?` + URLUsername + `?` + `((` + URLIP + `|(\[` + IP + `\])|(([a-zA-Z0-9]([a-zA-Z0-9-_]+)?[a-zA-Z0-9]([-\.][a-zA-Z0-9]+)*)|(` + URLSubdomain + `?))?(([a-zA-Z\x{00a1}-\x{ffff}0-9]+-?-?)*[a-zA-Z\x{00a1}-\x{ffff}0-9]+)(?:\.([a-zA-Z\x{00a1}-\x{ffff}]{1,}))?))\.?` + URLPort + `?` + URLPath + `?$`
|
||||
SSN string = `^\d{3}[- ]?\d{2}[- ]?\d{4}$`
|
||||
WinPath string = `^[a-zA-Z]:\\(?:[^\\/:*?"<>|\r\n]+\\)*[^\\/:*?"<>|\r\n]*$`
|
||||
UnixPath string = `^(/[^/\x00]*)+/?$`
|
||||
WinARPath string = `^(?:(?:[a-zA-Z]:|\\\\[a-z0-9_.$●-]+\\[a-z0-9_.$●-]+)\\|\\?[^\\/:*?"<>|\r\n]+\\?)(?:[^\\/:*?"<>|\r\n]+\\)*[^\\/:*?"<>|\r\n]*$`
|
||||
UnixARPath string = `^((\.{0,2}/)?([^/\x00]*))+/?$`
|
||||
Semver string = "^v?(?:0|[1-9]\\d*)\\.(?:0|[1-9]\\d*)\\.(?:0|[1-9]\\d*)(-(0|[1-9]\\d*|\\d*[a-zA-Z-][0-9a-zA-Z-]*)(\\.(0|[1-9]\\d*|\\d*[a-zA-Z-][0-9a-zA-Z-]*))*)?(\\+[0-9a-zA-Z-]+(\\.[0-9a-zA-Z-]+)*)?$"
|
||||
tagName string = "valid"
|
||||
hasLowerCase string = ".*[[:lower:]]"
|
||||
hasUpperCase string = ".*[[:upper:]]"
|
||||
hasWhitespace string = ".*[[:space:]]"
|
||||
hasWhitespaceOnly string = "^[[:space:]]+$"
|
||||
IMEI string = "^[0-9a-f]{14}$|^\\d{15}$|^\\d{18}$"
|
||||
IMSI string = "^\\d{14,15}$"
|
||||
E164 string = `^\+?[1-9]\d{1,14}$`
|
||||
)

// Used by IsFilePath func
const (
	// Unknown is unresolved OS type
	Unknown = iota
	// Win is Windows type
	Win
	// Unix is *nix OS types
	Unix
)

var (
	userRegexp          = regexp.MustCompile("^[a-zA-Z0-9!#$%&'*+/=?^_`{|}~.-]+$")
	hostRegexp          = regexp.MustCompile("^[^\\s]+\\.[^\\s]+$")
	userDotRegexp       = regexp.MustCompile("(^[.]{1})|([.]{1}$)|([.]{2,})")
	rxEmail             = regexp.MustCompile(Email)
	rxCreditCard        = regexp.MustCompile(CreditCard)
	rxISBN10            = regexp.MustCompile(ISBN10)
	rxISBN13            = regexp.MustCompile(ISBN13)
	rxUUID3             = regexp.MustCompile(UUID3)
	rxUUID4             = regexp.MustCompile(UUID4)
	rxUUID5             = regexp.MustCompile(UUID5)
	rxUUID              = regexp.MustCompile(UUID)
	rxAlpha             = regexp.MustCompile(Alpha)
	rxAlphanumeric      = regexp.MustCompile(Alphanumeric)
	rxNumeric           = regexp.MustCompile(Numeric)
	rxInt               = regexp.MustCompile(Int)
	rxFloat             = regexp.MustCompile(Float)
	rxHexadecimal       = regexp.MustCompile(Hexadecimal)
	rxHexcolor          = regexp.MustCompile(Hexcolor)
	rxRGBcolor          = regexp.MustCompile(RGBcolor)
	rxASCII             = regexp.MustCompile(ASCII)
	rxPrintableASCII    = regexp.MustCompile(PrintableASCII)
	rxMultibyte         = regexp.MustCompile(Multibyte)
	rxFullWidth         = regexp.MustCompile(FullWidth)
	rxHalfWidth         = regexp.MustCompile(HalfWidth)
	rxBase64            = regexp.MustCompile(Base64)
	rxDataURI           = regexp.MustCompile(DataURI)
	rxMagnetURI         = regexp.MustCompile(MagnetURI)
	rxLatitude          = regexp.MustCompile(Latitude)
	rxLongitude         = regexp.MustCompile(Longitude)
	rxDNSName           = regexp.MustCompile(DNSName)
	rxURL               = regexp.MustCompile(URL)
	rxSSN               = regexp.MustCompile(SSN)
	rxWinPath           = regexp.MustCompile(WinPath)
	rxUnixPath          = regexp.MustCompile(UnixPath)
	rxARWinPath         = regexp.MustCompile(WinARPath)
	rxARUnixPath        = regexp.MustCompile(UnixARPath)
	rxSemver            = regexp.MustCompile(Semver)
	rxHasLowerCase      = regexp.MustCompile(hasLowerCase)
	rxHasUpperCase      = regexp.MustCompile(hasUpperCase)
	rxHasWhitespace     = regexp.MustCompile(hasWhitespace)
	rxHasWhitespaceOnly = regexp.MustCompile(hasWhitespaceOnly)
	rxIMEI              = regexp.MustCompile(IMEI)
	rxIMSI              = regexp.MustCompile(IMSI)
	rxE164              = regexp.MustCompile(E164)
)
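Editorial note, not part of the vendored file: a minimal sketch of how the exported pattern constants and the validators that wrap these pre-compiled regexes might be used, assuming the package is imported by its upstream path.

```go
package main

import (
	"fmt"
	"regexp"

	"github.com/asaskevich/govalidator"
)

func main() {
	// Compile one of the exported pattern constants directly...
	rxUUID4 := regexp.MustCompile(govalidator.UUID4)
	fmt.Println(rxUUID4.MatchString("f47ac10b-58cc-4372-a567-0e02b2c3d479")) // true

	// ...or rely on the validators that wrap the pre-compiled regexes above.
	fmt.Println(govalidator.IsEmail("someone@example.org")) // true
}
```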

656 vendor/github.com/asaskevich/govalidator/types.go generated vendored Normal file

@@ -0,0 +1,656 @@
package govalidator

import (
	"reflect"
	"regexp"
	"sort"
	"sync"
)

// Validator is a wrapper for a validator function that returns bool and accepts string.
type Validator func(str string) bool

// CustomTypeValidator is a wrapper for validator functions that returns bool and accepts any type.
// The second parameter should be the context (in the case of validating a struct: the whole object being validated).
type CustomTypeValidator func(i interface{}, o interface{}) bool

// ParamValidator is a wrapper for validator functions that accept additional parameters.
type ParamValidator func(str string, params ...string) bool

// InterfaceParamValidator is a wrapper for functions that accept variants parameters for an interface value
type InterfaceParamValidator func(in interface{}, params ...string) bool

type tagOptionsMap map[string]tagOption

func (t tagOptionsMap) orderedKeys() []string {
	var keys []string
	for k := range t {
		keys = append(keys, k)
	}

	sort.Slice(keys, func(a, b int) bool {
		return t[keys[a]].order < t[keys[b]].order
	})

	return keys
}

type tagOption struct {
	name               string
	customErrorMessage string
	order              int
}

// UnsupportedTypeError is a wrapper for reflect.Type
type UnsupportedTypeError struct {
	Type reflect.Type
}

// stringValues is a slice of reflect.Value holding *reflect.StringValue.
// It implements the methods to sort by string.
type stringValues []reflect.Value

// InterfaceParamTagMap is a map of functions accept variants parameters for an interface value
var InterfaceParamTagMap = map[string]InterfaceParamValidator{
	"type": IsType,
}

// InterfaceParamTagRegexMap maps interface param tags to their respective regexes.
var InterfaceParamTagRegexMap = map[string]*regexp.Regexp{
	"type": regexp.MustCompile(`^type\((.*)\)$`),
}

// ParamTagMap is a map of functions accept variants parameters
var ParamTagMap = map[string]ParamValidator{
	"length":          ByteLength,
	"range":           Range,
	"runelength":      RuneLength,
	"stringlength":    StringLength,
	"matches":         StringMatches,
	"in":              IsInRaw,
	"rsapub":          IsRsaPub,
	"minstringlength": MinStringLength,
	"maxstringlength": MaxStringLength,
}

// ParamTagRegexMap maps param tags to their respective regexes.
var ParamTagRegexMap = map[string]*regexp.Regexp{
	"range":           regexp.MustCompile("^range\\((\\d+)\\|(\\d+)\\)$"),
	"length":          regexp.MustCompile("^length\\((\\d+)\\|(\\d+)\\)$"),
	"runelength":      regexp.MustCompile("^runelength\\((\\d+)\\|(\\d+)\\)$"),
	"stringlength":    regexp.MustCompile("^stringlength\\((\\d+)\\|(\\d+)\\)$"),
	"in":              regexp.MustCompile(`^in\((.*)\)`),
	"matches":         regexp.MustCompile(`^matches\((.+)\)$`),
	"rsapub":          regexp.MustCompile("^rsapub\\((\\d+)\\)$"),
	"minstringlength": regexp.MustCompile("^minstringlength\\((\\d+)\\)$"),
	"maxstringlength": regexp.MustCompile("^maxstringlength\\((\\d+)\\)$"),
}

type customTypeTagMap struct {
	validators map[string]CustomTypeValidator

	sync.RWMutex
}

func (tm *customTypeTagMap) Get(name string) (CustomTypeValidator, bool) {
	tm.RLock()
	defer tm.RUnlock()
	v, ok := tm.validators[name]
	return v, ok
}

func (tm *customTypeTagMap) Set(name string, ctv CustomTypeValidator) {
	tm.Lock()
	defer tm.Unlock()
	tm.validators[name] = ctv
}

// CustomTypeTagMap is a map of functions that can be used as tags for ValidateStruct function.
// Use this to validate compound or custom types that need to be handled as a whole, e.g.
// `type UUID [16]byte` (this would be handled as an array of bytes).
var CustomTypeTagMap = &customTypeTagMap{validators: make(map[string]CustomTypeValidator)}
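Editorial note, not part of the vendored file: a sketch of registering a custom validator via CustomTypeTagMap.Set, along the lines the comment above suggests. The tag name and the [16]byte type are invented for illustration.

```go
package main

import "github.com/asaskevich/govalidator"

func init() {
	// "nonEmptyUUID" is a made-up tag name; the function literal matches the
	// CustomTypeValidator signature defined earlier in this file.
	govalidator.CustomTypeTagMap.Set("nonEmptyUUID", func(i interface{}, _ interface{}) bool {
		u, ok := i.([16]byte)
		return ok && u != [16]byte{}
	})
}

func main() {}
```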

// TagMap is a map of functions, that can be used as tags for ValidateStruct function.
var TagMap = map[string]Validator{
	"email":              IsEmail,
	"url":                IsURL,
	"dialstring":         IsDialString,
	"requrl":             IsRequestURL,
	"requri":             IsRequestURI,
	"alpha":              IsAlpha,
	"utfletter":          IsUTFLetter,
	"alphanum":           IsAlphanumeric,
	"utfletternum":       IsUTFLetterNumeric,
	"numeric":            IsNumeric,
	"utfnumeric":         IsUTFNumeric,
	"utfdigit":           IsUTFDigit,
	"hexadecimal":        IsHexadecimal,
	"hexcolor":           IsHexcolor,
	"rgbcolor":           IsRGBcolor,
	"lowercase":          IsLowerCase,
	"uppercase":          IsUpperCase,
	"int":                IsInt,
	"float":              IsFloat,
	"null":               IsNull,
	"notnull":            IsNotNull,
	"uuid":               IsUUID,
	"uuidv3":             IsUUIDv3,
	"uuidv4":             IsUUIDv4,
	"uuidv5":             IsUUIDv5,
	"creditcard":         IsCreditCard,
	"isbn10":             IsISBN10,
	"isbn13":             IsISBN13,
	"json":               IsJSON,
	"multibyte":          IsMultibyte,
	"ascii":              IsASCII,
	"printableascii":     IsPrintableASCII,
	"fullwidth":          IsFullWidth,
	"halfwidth":          IsHalfWidth,
	"variablewidth":      IsVariableWidth,
	"base64":             IsBase64,
	"datauri":            IsDataURI,
	"ip":                 IsIP,
	"port":               IsPort,
	"ipv4":               IsIPv4,
	"ipv6":               IsIPv6,
	"dns":                IsDNSName,
	"host":               IsHost,
	"mac":                IsMAC,
	"latitude":           IsLatitude,
	"longitude":          IsLongitude,
	"ssn":                IsSSN,
	"semver":             IsSemver,
	"rfc3339":            IsRFC3339,
	"rfc3339WithoutZone": IsRFC3339WithoutZone,
	"ISO3166Alpha2":      IsISO3166Alpha2,
	"ISO3166Alpha3":      IsISO3166Alpha3,
	"ISO4217":            IsISO4217,
	"IMEI":               IsIMEI,
	"ulid":               IsULID,
}
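Editorial note, not part of the vendored file: a sketch of driving the tag map above through ValidateStruct, which the comments in this file reference. The Account type and field values are invented, and ValidateStruct is assumed to return (bool, error).

```go
package main

import (
	"fmt"

	"github.com/asaskevich/govalidator"
)

type Account struct {
	Email string `valid:"email"` // "valid" matches the tagName constant above
	Home  string `valid:"url"`
}

func main() {
	ok, err := govalidator.ValidateStruct(Account{Email: "admin@example.org", Home: "https://example.org"})
	fmt.Println(ok, err) // expected: true <nil>
}
```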

// ISO3166Entry stores country codes
type ISO3166Entry struct {
	EnglishShortName string
	FrenchShortName  string
	Alpha2Code       string
	Alpha3Code       string
	Numeric          string
}

// ISO3166List based on https://www.iso.org/obp/ui/#search/code/ Code Type "Officially Assigned Codes"
var ISO3166List = []ISO3166Entry{
{"Afghanistan", "Afghanistan (l')", "AF", "AFG", "004"},
|
||||
{"Albania", "Albanie (l')", "AL", "ALB", "008"},
|
||||
{"Antarctica", "Antarctique (l')", "AQ", "ATA", "010"},
|
||||
{"Algeria", "Algérie (l')", "DZ", "DZA", "012"},
|
||||
{"American Samoa", "Samoa américaines (les)", "AS", "ASM", "016"},
|
||||
{"Andorra", "Andorre (l')", "AD", "AND", "020"},
|
||||
{"Angola", "Angola (l')", "AO", "AGO", "024"},
|
||||
{"Antigua and Barbuda", "Antigua-et-Barbuda", "AG", "ATG", "028"},
|
||||
{"Azerbaijan", "Azerbaïdjan (l')", "AZ", "AZE", "031"},
|
||||
{"Argentina", "Argentine (l')", "AR", "ARG", "032"},
|
||||
{"Australia", "Australie (l')", "AU", "AUS", "036"},
|
||||
{"Austria", "Autriche (l')", "AT", "AUT", "040"},
|
||||
{"Bahamas (the)", "Bahamas (les)", "BS", "BHS", "044"},
|
||||
{"Bahrain", "Bahreïn", "BH", "BHR", "048"},
|
||||
{"Bangladesh", "Bangladesh (le)", "BD", "BGD", "050"},
|
||||
{"Armenia", "Arménie (l')", "AM", "ARM", "051"},
|
||||
{"Barbados", "Barbade (la)", "BB", "BRB", "052"},
|
||||
{"Belgium", "Belgique (la)", "BE", "BEL", "056"},
|
||||
{"Bermuda", "Bermudes (les)", "BM", "BMU", "060"},
|
||||
{"Bhutan", "Bhoutan (le)", "BT", "BTN", "064"},
|
||||
{"Bolivia (Plurinational State of)", "Bolivie (État plurinational de)", "BO", "BOL", "068"},
|
||||
{"Bosnia and Herzegovina", "Bosnie-Herzégovine (la)", "BA", "BIH", "070"},
|
||||
{"Botswana", "Botswana (le)", "BW", "BWA", "072"},
|
||||
{"Bouvet Island", "Bouvet (l'Île)", "BV", "BVT", "074"},
|
||||
{"Brazil", "Brésil (le)", "BR", "BRA", "076"},
|
||||
{"Belize", "Belize (le)", "BZ", "BLZ", "084"},
|
||||
{"British Indian Ocean Territory (the)", "Indien (le Territoire britannique de l'océan)", "IO", "IOT", "086"},
|
||||
{"Solomon Islands", "Salomon (Îles)", "SB", "SLB", "090"},
|
||||
{"Virgin Islands (British)", "Vierges britanniques (les Îles)", "VG", "VGB", "092"},
|
||||
{"Brunei Darussalam", "Brunéi Darussalam (le)", "BN", "BRN", "096"},
|
||||
{"Bulgaria", "Bulgarie (la)", "BG", "BGR", "100"},
|
||||
{"Myanmar", "Myanmar (le)", "MM", "MMR", "104"},
|
||||
{"Burundi", "Burundi (le)", "BI", "BDI", "108"},
|
||||
{"Belarus", "Bélarus (le)", "BY", "BLR", "112"},
|
||||
{"Cambodia", "Cambodge (le)", "KH", "KHM", "116"},
|
||||
{"Cameroon", "Cameroun (le)", "CM", "CMR", "120"},
|
||||
{"Canada", "Canada (le)", "CA", "CAN", "124"},
|
||||
{"Cabo Verde", "Cabo Verde", "CV", "CPV", "132"},
|
||||
{"Cayman Islands (the)", "Caïmans (les Îles)", "KY", "CYM", "136"},
|
||||
{"Central African Republic (the)", "République centrafricaine (la)", "CF", "CAF", "140"},
|
||||
{"Sri Lanka", "Sri Lanka", "LK", "LKA", "144"},
|
||||
{"Chad", "Tchad (le)", "TD", "TCD", "148"},
|
||||
{"Chile", "Chili (le)", "CL", "CHL", "152"},
|
||||
{"China", "Chine (la)", "CN", "CHN", "156"},
|
||||
{"Taiwan (Province of China)", "Taïwan (Province de Chine)", "TW", "TWN", "158"},
|
||||
{"Christmas Island", "Christmas (l'Île)", "CX", "CXR", "162"},
|
||||
{"Cocos (Keeling) Islands (the)", "Cocos (les Îles)/ Keeling (les Îles)", "CC", "CCK", "166"},
|
||||
{"Colombia", "Colombie (la)", "CO", "COL", "170"},
|
||||
{"Comoros (the)", "Comores (les)", "KM", "COM", "174"},
|
||||
{"Mayotte", "Mayotte", "YT", "MYT", "175"},
|
||||
{"Congo (the)", "Congo (le)", "CG", "COG", "178"},
|
||||
{"Congo (the Democratic Republic of the)", "Congo (la République démocratique du)", "CD", "COD", "180"},
|
||||
{"Cook Islands (the)", "Cook (les Îles)", "CK", "COK", "184"},
|
||||
{"Costa Rica", "Costa Rica (le)", "CR", "CRI", "188"},
|
||||
{"Croatia", "Croatie (la)", "HR", "HRV", "191"},
|
||||
{"Cuba", "Cuba", "CU", "CUB", "192"},
|
||||
{"Cyprus", "Chypre", "CY", "CYP", "196"},
|
||||
{"Czech Republic (the)", "tchèque (la République)", "CZ", "CZE", "203"},
|
||||
{"Benin", "Bénin (le)", "BJ", "BEN", "204"},
|
||||
{"Denmark", "Danemark (le)", "DK", "DNK", "208"},
|
||||
{"Dominica", "Dominique (la)", "DM", "DMA", "212"},
|
||||
{"Dominican Republic (the)", "dominicaine (la République)", "DO", "DOM", "214"},
|
||||
{"Ecuador", "Équateur (l')", "EC", "ECU", "218"},
|
||||
{"El Salvador", "El Salvador", "SV", "SLV", "222"},
|
||||
{"Equatorial Guinea", "Guinée équatoriale (la)", "GQ", "GNQ", "226"},
|
||||
{"Ethiopia", "Éthiopie (l')", "ET", "ETH", "231"},
|
||||
{"Eritrea", "Érythrée (l')", "ER", "ERI", "232"},
|
||||
{"Estonia", "Estonie (l')", "EE", "EST", "233"},
|
||||
{"Faroe Islands (the)", "Féroé (les Îles)", "FO", "FRO", "234"},
|
||||
{"Falkland Islands (the) [Malvinas]", "Falkland (les Îles)/Malouines (les Îles)", "FK", "FLK", "238"},
|
||||
{"South Georgia and the South Sandwich Islands", "Géorgie du Sud-et-les Îles Sandwich du Sud (la)", "GS", "SGS", "239"},
|
||||
{"Fiji", "Fidji (les)", "FJ", "FJI", "242"},
|
||||
{"Finland", "Finlande (la)", "FI", "FIN", "246"},
|
||||
{"Åland Islands", "Åland(les Îles)", "AX", "ALA", "248"},
|
||||
{"France", "France (la)", "FR", "FRA", "250"},
|
||||
{"French Guiana", "Guyane française (la )", "GF", "GUF", "254"},
|
||||
{"French Polynesia", "Polynésie française (la)", "PF", "PYF", "258"},
|
||||
{"French Southern Territories (the)", "Terres australes françaises (les)", "TF", "ATF", "260"},
|
||||
{"Djibouti", "Djibouti", "DJ", "DJI", "262"},
|
||||
{"Gabon", "Gabon (le)", "GA", "GAB", "266"},
|
||||
{"Georgia", "Géorgie (la)", "GE", "GEO", "268"},
|
||||
{"Gambia (the)", "Gambie (la)", "GM", "GMB", "270"},
|
||||
{"Palestine, State of", "Palestine, État de", "PS", "PSE", "275"},
|
||||
{"Germany", "Allemagne (l')", "DE", "DEU", "276"},
|
||||
{"Ghana", "Ghana (le)", "GH", "GHA", "288"},
|
||||
{"Gibraltar", "Gibraltar", "GI", "GIB", "292"},
|
||||
{"Kiribati", "Kiribati", "KI", "KIR", "296"},
|
||||
{"Greece", "Grèce (la)", "GR", "GRC", "300"},
|
||||
{"Greenland", "Groenland (le)", "GL", "GRL", "304"},
|
||||
{"Grenada", "Grenade (la)", "GD", "GRD", "308"},
|
||||
{"Guadeloupe", "Guadeloupe (la)", "GP", "GLP", "312"},
|
||||
{"Guam", "Guam", "GU", "GUM", "316"},
|
||||
{"Guatemala", "Guatemala (le)", "GT", "GTM", "320"},
|
||||
{"Guinea", "Guinée (la)", "GN", "GIN", "324"},
|
||||
{"Guyana", "Guyana (le)", "GY", "GUY", "328"},
|
||||
{"Haiti", "Haïti", "HT", "HTI", "332"},
|
||||
{"Heard Island and McDonald Islands", "Heard-et-Îles MacDonald (l'Île)", "HM", "HMD", "334"},
|
||||
{"Holy See (the)", "Saint-Siège (le)", "VA", "VAT", "336"},
|
||||
{"Honduras", "Honduras (le)", "HN", "HND", "340"},
|
||||
{"Hong Kong", "Hong Kong", "HK", "HKG", "344"},
|
||||
{"Hungary", "Hongrie (la)", "HU", "HUN", "348"},
|
||||
{"Iceland", "Islande (l')", "IS", "ISL", "352"},
|
||||
{"India", "Inde (l')", "IN", "IND", "356"},
|
||||
{"Indonesia", "Indonésie (l')", "ID", "IDN", "360"},
|
||||
{"Iran (Islamic Republic of)", "Iran (République Islamique d')", "IR", "IRN", "364"},
|
||||
{"Iraq", "Iraq (l')", "IQ", "IRQ", "368"},
|
||||
{"Ireland", "Irlande (l')", "IE", "IRL", "372"},
|
||||
{"Israel", "Israël", "IL", "ISR", "376"},
|
||||
{"Italy", "Italie (l')", "IT", "ITA", "380"},
|
||||
{"Côte d'Ivoire", "Côte d'Ivoire (la)", "CI", "CIV", "384"},
|
||||
{"Jamaica", "Jamaïque (la)", "JM", "JAM", "388"},
|
||||
{"Japan", "Japon (le)", "JP", "JPN", "392"},
|
||||
{"Kazakhstan", "Kazakhstan (le)", "KZ", "KAZ", "398"},
|
||||
{"Jordan", "Jordanie (la)", "JO", "JOR", "400"},
|
||||
{"Kenya", "Kenya (le)", "KE", "KEN", "404"},
|
||||
{"Korea (the Democratic People's Republic of)", "Corée (la République populaire démocratique de)", "KP", "PRK", "408"},
|
||||
{"Korea (the Republic of)", "Corée (la République de)", "KR", "KOR", "410"},
|
||||
{"Kuwait", "Koweït (le)", "KW", "KWT", "414"},
|
||||
{"Kyrgyzstan", "Kirghizistan (le)", "KG", "KGZ", "417"},
|
||||
{"Lao People's Democratic Republic (the)", "Lao, République démocratique populaire", "LA", "LAO", "418"},
|
||||
{"Lebanon", "Liban (le)", "LB", "LBN", "422"},
|
||||
{"Lesotho", "Lesotho (le)", "LS", "LSO", "426"},
|
||||
{"Latvia", "Lettonie (la)", "LV", "LVA", "428"},
|
||||
{"Liberia", "Libéria (le)", "LR", "LBR", "430"},
|
||||
{"Libya", "Libye (la)", "LY", "LBY", "434"},
|
||||
{"Liechtenstein", "Liechtenstein (le)", "LI", "LIE", "438"},
|
||||
{"Lithuania", "Lituanie (la)", "LT", "LTU", "440"},
|
||||
{"Luxembourg", "Luxembourg (le)", "LU", "LUX", "442"},
|
||||
{"Macao", "Macao", "MO", "MAC", "446"},
|
||||
{"Madagascar", "Madagascar", "MG", "MDG", "450"},
|
||||
{"Malawi", "Malawi (le)", "MW", "MWI", "454"},
|
||||
{"Malaysia", "Malaisie (la)", "MY", "MYS", "458"},
|
||||
{"Maldives", "Maldives (les)", "MV", "MDV", "462"},
|
||||
{"Mali", "Mali (le)", "ML", "MLI", "466"},
|
||||
{"Malta", "Malte", "MT", "MLT", "470"},
|
||||
{"Martinique", "Martinique (la)", "MQ", "MTQ", "474"},
|
||||
{"Mauritania", "Mauritanie (la)", "MR", "MRT", "478"},
|
||||
{"Mauritius", "Maurice", "MU", "MUS", "480"},
|
||||
{"Mexico", "Mexique (le)", "MX", "MEX", "484"},
|
||||
{"Monaco", "Monaco", "MC", "MCO", "492"},
|
||||
{"Mongolia", "Mongolie (la)", "MN", "MNG", "496"},
|
||||
{"Moldova (the Republic of)", "Moldova , République de", "MD", "MDA", "498"},
|
||||
{"Montenegro", "Monténégro (le)", "ME", "MNE", "499"},
|
||||
{"Montserrat", "Montserrat", "MS", "MSR", "500"},
|
||||
{"Morocco", "Maroc (le)", "MA", "MAR", "504"},
|
||||
{"Mozambique", "Mozambique (le)", "MZ", "MOZ", "508"},
|
||||
{"Oman", "Oman", "OM", "OMN", "512"},
|
||||
{"Namibia", "Namibie (la)", "NA", "NAM", "516"},
|
||||
{"Nauru", "Nauru", "NR", "NRU", "520"},
|
||||
{"Nepal", "Népal (le)", "NP", "NPL", "524"},
|
||||
{"Netherlands (the)", "Pays-Bas (les)", "NL", "NLD", "528"},
|
||||
{"Curaçao", "Curaçao", "CW", "CUW", "531"},
|
||||
{"Aruba", "Aruba", "AW", "ABW", "533"},
|
||||
{"Sint Maarten (Dutch part)", "Saint-Martin (partie néerlandaise)", "SX", "SXM", "534"},
|
||||
{"Bonaire, Sint Eustatius and Saba", "Bonaire, Saint-Eustache et Saba", "BQ", "BES", "535"},
|
||||
{"New Caledonia", "Nouvelle-Calédonie (la)", "NC", "NCL", "540"},
|
||||
{"Vanuatu", "Vanuatu (le)", "VU", "VUT", "548"},
|
||||
{"New Zealand", "Nouvelle-Zélande (la)", "NZ", "NZL", "554"},
|
||||
{"Nicaragua", "Nicaragua (le)", "NI", "NIC", "558"},
|
||||
{"Niger (the)", "Niger (le)", "NE", "NER", "562"},
|
||||
{"Nigeria", "Nigéria (le)", "NG", "NGA", "566"},
|
||||
{"Niue", "Niue", "NU", "NIU", "570"},
|
||||
{"Norfolk Island", "Norfolk (l'Île)", "NF", "NFK", "574"},
|
||||
{"Norway", "Norvège (la)", "NO", "NOR", "578"},
|
||||
{"Northern Mariana Islands (the)", "Mariannes du Nord (les Îles)", "MP", "MNP", "580"},
|
||||
{"United States Minor Outlying Islands (the)", "Îles mineures éloignées des États-Unis (les)", "UM", "UMI", "581"},
|
||||
{"Micronesia (Federated States of)", "Micronésie (États fédérés de)", "FM", "FSM", "583"},
|
||||
{"Marshall Islands (the)", "Marshall (Îles)", "MH", "MHL", "584"},
|
||||
{"Palau", "Palaos (les)", "PW", "PLW", "585"},
|
||||
{"Pakistan", "Pakistan (le)", "PK", "PAK", "586"},
|
||||
{"Panama", "Panama (le)", "PA", "PAN", "591"},
|
||||
{"Papua New Guinea", "Papouasie-Nouvelle-Guinée (la)", "PG", "PNG", "598"},
|
||||
{"Paraguay", "Paraguay (le)", "PY", "PRY", "600"},
|
||||
{"Peru", "Pérou (le)", "PE", "PER", "604"},
|
||||
{"Philippines (the)", "Philippines (les)", "PH", "PHL", "608"},
|
||||
{"Pitcairn", "Pitcairn", "PN", "PCN", "612"},
|
||||
{"Poland", "Pologne (la)", "PL", "POL", "616"},
|
||||
{"Portugal", "Portugal (le)", "PT", "PRT", "620"},
|
||||
{"Guinea-Bissau", "Guinée-Bissau (la)", "GW", "GNB", "624"},
|
||||
{"Timor-Leste", "Timor-Leste (le)", "TL", "TLS", "626"},
|
||||
{"Puerto Rico", "Porto Rico", "PR", "PRI", "630"},
|
||||
{"Qatar", "Qatar (le)", "QA", "QAT", "634"},
|
||||
{"Réunion", "Réunion (La)", "RE", "REU", "638"},
|
||||
{"Romania", "Roumanie (la)", "RO", "ROU", "642"},
|
||||
{"Russian Federation (the)", "Russie (la Fédération de)", "RU", "RUS", "643"},
|
||||
{"Rwanda", "Rwanda (le)", "RW", "RWA", "646"},
|
||||
{"Saint Barthélemy", "Saint-Barthélemy", "BL", "BLM", "652"},
|
||||
{"Saint Helena, Ascension and Tristan da Cunha", "Sainte-Hélène, Ascension et Tristan da Cunha", "SH", "SHN", "654"},
|
||||
{"Saint Kitts and Nevis", "Saint-Kitts-et-Nevis", "KN", "KNA", "659"},
|
||||
{"Anguilla", "Anguilla", "AI", "AIA", "660"},
|
||||
{"Saint Lucia", "Sainte-Lucie", "LC", "LCA", "662"},
|
||||
{"Saint Martin (French part)", "Saint-Martin (partie française)", "MF", "MAF", "663"},
|
||||
{"Saint Pierre and Miquelon", "Saint-Pierre-et-Miquelon", "PM", "SPM", "666"},
|
||||
{"Saint Vincent and the Grenadines", "Saint-Vincent-et-les Grenadines", "VC", "VCT", "670"},
|
||||
{"San Marino", "Saint-Marin", "SM", "SMR", "674"},
|
||||
{"Sao Tome and Principe", "Sao Tomé-et-Principe", "ST", "STP", "678"},
|
||||
{"Saudi Arabia", "Arabie saoudite (l')", "SA", "SAU", "682"},
|
||||
{"Senegal", "Sénégal (le)", "SN", "SEN", "686"},
|
||||
{"Serbia", "Serbie (la)", "RS", "SRB", "688"},
|
||||
{"Seychelles", "Seychelles (les)", "SC", "SYC", "690"},
|
||||
{"Sierra Leone", "Sierra Leone (la)", "SL", "SLE", "694"},
|
||||
{"Singapore", "Singapour", "SG", "SGP", "702"},
|
||||
{"Slovakia", "Slovaquie (la)", "SK", "SVK", "703"},
|
||||
{"Viet Nam", "Viet Nam (le)", "VN", "VNM", "704"},
|
||||
{"Slovenia", "Slovénie (la)", "SI", "SVN", "705"},
|
||||
{"Somalia", "Somalie (la)", "SO", "SOM", "706"},
|
||||
{"South Africa", "Afrique du Sud (l')", "ZA", "ZAF", "710"},
|
||||
{"Zimbabwe", "Zimbabwe (le)", "ZW", "ZWE", "716"},
|
||||
{"Spain", "Espagne (l')", "ES", "ESP", "724"},
|
||||
{"South Sudan", "Soudan du Sud (le)", "SS", "SSD", "728"},
|
||||
{"Sudan (the)", "Soudan (le)", "SD", "SDN", "729"},
|
||||
{"Western Sahara*", "Sahara occidental (le)*", "EH", "ESH", "732"},
|
||||
{"Suriname", "Suriname (le)", "SR", "SUR", "740"},
|
||||
{"Svalbard and Jan Mayen", "Svalbard et l'Île Jan Mayen (le)", "SJ", "SJM", "744"},
|
||||
{"Swaziland", "Swaziland (le)", "SZ", "SWZ", "748"},
|
||||
{"Sweden", "Suède (la)", "SE", "SWE", "752"},
|
||||
{"Switzerland", "Suisse (la)", "CH", "CHE", "756"},
|
||||
{"Syrian Arab Republic", "République arabe syrienne (la)", "SY", "SYR", "760"},
|
||||
{"Tajikistan", "Tadjikistan (le)", "TJ", "TJK", "762"},
|
||||
{"Thailand", "Thaïlande (la)", "TH", "THA", "764"},
|
||||
{"Togo", "Togo (le)", "TG", "TGO", "768"},
|
||||
{"Tokelau", "Tokelau (les)", "TK", "TKL", "772"},
|
||||
{"Tonga", "Tonga (les)", "TO", "TON", "776"},
|
||||
{"Trinidad and Tobago", "Trinité-et-Tobago (la)", "TT", "TTO", "780"},
|
||||
{"United Arab Emirates (the)", "Émirats arabes unis (les)", "AE", "ARE", "784"},
|
||||
{"Tunisia", "Tunisie (la)", "TN", "TUN", "788"},
|
||||
{"Turkey", "Turquie (la)", "TR", "TUR", "792"},
|
||||
{"Turkmenistan", "Turkménistan (le)", "TM", "TKM", "795"},
|
||||
{"Turks and Caicos Islands (the)", "Turks-et-Caïcos (les Îles)", "TC", "TCA", "796"},
|
||||
{"Tuvalu", "Tuvalu (les)", "TV", "TUV", "798"},
|
||||
{"Uganda", "Ouganda (l')", "UG", "UGA", "800"},
|
||||
{"Ukraine", "Ukraine (l')", "UA", "UKR", "804"},
|
||||
{"Macedonia (the former Yugoslav Republic of)", "Macédoine (l'ex‑République yougoslave de)", "MK", "MKD", "807"},
|
||||
{"Egypt", "Égypte (l')", "EG", "EGY", "818"},
|
||||
{"United Kingdom of Great Britain and Northern Ireland (the)", "Royaume-Uni de Grande-Bretagne et d'Irlande du Nord (le)", "GB", "GBR", "826"},
|
||||
{"Guernsey", "Guernesey", "GG", "GGY", "831"},
|
||||
{"Jersey", "Jersey", "JE", "JEY", "832"},
|
||||
{"Isle of Man", "Île de Man", "IM", "IMN", "833"},
|
||||
{"Tanzania, United Republic of", "Tanzanie, République-Unie de", "TZ", "TZA", "834"},
|
||||
{"United States of America (the)", "États-Unis d'Amérique (les)", "US", "USA", "840"},
|
||||
{"Virgin Islands (U.S.)", "Vierges des États-Unis (les Îles)", "VI", "VIR", "850"},
|
||||
{"Burkina Faso", "Burkina Faso (le)", "BF", "BFA", "854"},
|
||||
{"Uruguay", "Uruguay (l')", "UY", "URY", "858"},
|
||||
{"Uzbekistan", "Ouzbékistan (l')", "UZ", "UZB", "860"},
|
||||
{"Venezuela (Bolivarian Republic of)", "Venezuela (République bolivarienne du)", "VE", "VEN", "862"},
|
||||
{"Wallis and Futuna", "Wallis-et-Futuna", "WF", "WLF", "876"},
|
||||
{"Samoa", "Samoa (le)", "WS", "WSM", "882"},
|
||||
{"Yemen", "Yémen (le)", "YE", "YEM", "887"},
|
||||
{"Zambia", "Zambie (la)", "ZM", "ZMB", "894"},
|
||||
}
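Editorial note, not part of the vendored file: a quick check against the country table above via IsISO3166Alpha2, which TagMap references.

```go
package main

import (
	"fmt"

	"github.com/asaskevich/govalidator"
)

func main() {
	fmt.Println(govalidator.IsISO3166Alpha2("NL")) // true
	fmt.Println(govalidator.IsISO3166Alpha2("XX")) // false, not an assigned code
}
```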
|
||||
|
||||
// ISO4217List is the list of ISO currency codes
|
||||
var ISO4217List = []string{
|
||||
"AED", "AFN", "ALL", "AMD", "ANG", "AOA", "ARS", "AUD", "AWG", "AZN",
|
||||
"BAM", "BBD", "BDT", "BGN", "BHD", "BIF", "BMD", "BND", "BOB", "BOV", "BRL", "BSD", "BTN", "BWP", "BYN", "BZD",
|
||||
"CAD", "CDF", "CHE", "CHF", "CHW", "CLF", "CLP", "CNY", "COP", "COU", "CRC", "CUC", "CUP", "CVE", "CZK",
|
||||
"DJF", "DKK", "DOP", "DZD",
|
||||
"EGP", "ERN", "ETB", "EUR",
|
||||
"FJD", "FKP",
|
||||
"GBP", "GEL", "GHS", "GIP", "GMD", "GNF", "GTQ", "GYD",
|
||||
"HKD", "HNL", "HRK", "HTG", "HUF",
|
||||
"IDR", "ILS", "INR", "IQD", "IRR", "ISK",
|
||||
"JMD", "JOD", "JPY",
|
||||
"KES", "KGS", "KHR", "KMF", "KPW", "KRW", "KWD", "KYD", "KZT",
|
||||
"LAK", "LBP", "LKR", "LRD", "LSL", "LYD",
|
||||
"MAD", "MDL", "MGA", "MKD", "MMK", "MNT", "MOP", "MRO", "MUR", "MVR", "MWK", "MXN", "MXV", "MYR", "MZN",
|
||||
"NAD", "NGN", "NIO", "NOK", "NPR", "NZD",
|
||||
"OMR",
|
||||
"PAB", "PEN", "PGK", "PHP", "PKR", "PLN", "PYG",
|
||||
"QAR",
|
||||
"RON", "RSD", "RUB", "RWF",
|
||||
"SAR", "SBD", "SCR", "SDG", "SEK", "SGD", "SHP", "SLL", "SOS", "SRD", "SSP", "STD", "STN", "SVC", "SYP", "SZL",
|
||||
"THB", "TJS", "TMT", "TND", "TOP", "TRY", "TTD", "TWD", "TZS",
|
||||
"UAH", "UGX", "USD", "USN", "UYI", "UYU", "UYW", "UZS",
|
||||
"VEF", "VES", "VND", "VUV",
|
||||
"WST",
|
||||
"XAF", "XAG", "XAU", "XBA", "XBB", "XBC", "XBD", "XCD", "XDR", "XOF", "XPD", "XPF", "XPT", "XSU", "XTS", "XUA", "XXX",
|
||||
"YER",
|
||||
"ZAR", "ZMW", "ZWL",
|
||||
}
|
||||
|
||||
// ISO693Entry stores ISO language codes
|
||||
type ISO693Entry struct {
|
||||
Alpha3bCode string
|
||||
Alpha2Code string
|
||||
English string
|
||||
}
|
||||
|
||||
//ISO693List based on http://data.okfn.org/data/core/language-codes/r/language-codes-3b2.json
|
||||
var ISO693List = []ISO693Entry{
|
||||
{Alpha3bCode: "aar", Alpha2Code: "aa", English: "Afar"},
|
||||
{Alpha3bCode: "abk", Alpha2Code: "ab", English: "Abkhazian"},
|
||||
{Alpha3bCode: "afr", Alpha2Code: "af", English: "Afrikaans"},
|
||||
{Alpha3bCode: "aka", Alpha2Code: "ak", English: "Akan"},
|
||||
{Alpha3bCode: "alb", Alpha2Code: "sq", English: "Albanian"},
|
||||
{Alpha3bCode: "amh", Alpha2Code: "am", English: "Amharic"},
|
||||
{Alpha3bCode: "ara", Alpha2Code: "ar", English: "Arabic"},
|
||||
{Alpha3bCode: "arg", Alpha2Code: "an", English: "Aragonese"},
|
||||
{Alpha3bCode: "arm", Alpha2Code: "hy", English: "Armenian"},
|
||||
{Alpha3bCode: "asm", Alpha2Code: "as", English: "Assamese"},
|
||||
{Alpha3bCode: "ava", Alpha2Code: "av", English: "Avaric"},
|
||||
{Alpha3bCode: "ave", Alpha2Code: "ae", English: "Avestan"},
|
||||
{Alpha3bCode: "aym", Alpha2Code: "ay", English: "Aymara"},
|
||||
{Alpha3bCode: "aze", Alpha2Code: "az", English: "Azerbaijani"},
|
||||
{Alpha3bCode: "bak", Alpha2Code: "ba", English: "Bashkir"},
|
||||
{Alpha3bCode: "bam", Alpha2Code: "bm", English: "Bambara"},
|
||||
{Alpha3bCode: "baq", Alpha2Code: "eu", English: "Basque"},
|
||||
{Alpha3bCode: "bel", Alpha2Code: "be", English: "Belarusian"},
|
||||
{Alpha3bCode: "ben", Alpha2Code: "bn", English: "Bengali"},
|
||||
{Alpha3bCode: "bih", Alpha2Code: "bh", English: "Bihari languages"},
|
||||
{Alpha3bCode: "bis", Alpha2Code: "bi", English: "Bislama"},
|
||||
{Alpha3bCode: "bos", Alpha2Code: "bs", English: "Bosnian"},
|
||||
{Alpha3bCode: "bre", Alpha2Code: "br", English: "Breton"},
|
||||
{Alpha3bCode: "bul", Alpha2Code: "bg", English: "Bulgarian"},
|
||||
{Alpha3bCode: "bur", Alpha2Code: "my", English: "Burmese"},
|
||||
{Alpha3bCode: "cat", Alpha2Code: "ca", English: "Catalan; Valencian"},
|
||||
{Alpha3bCode: "cha", Alpha2Code: "ch", English: "Chamorro"},
|
||||
{Alpha3bCode: "che", Alpha2Code: "ce", English: "Chechen"},
|
||||
{Alpha3bCode: "chi", Alpha2Code: "zh", English: "Chinese"},
|
||||
{Alpha3bCode: "chu", Alpha2Code: "cu", English: "Church Slavic; Old Slavonic; Church Slavonic; Old Bulgarian; Old Church Slavonic"},
|
||||
{Alpha3bCode: "chv", Alpha2Code: "cv", English: "Chuvash"},
|
||||
{Alpha3bCode: "cor", Alpha2Code: "kw", English: "Cornish"},
|
||||
{Alpha3bCode: "cos", Alpha2Code: "co", English: "Corsican"},
|
||||
{Alpha3bCode: "cre", Alpha2Code: "cr", English: "Cree"},
|
||||
{Alpha3bCode: "cze", Alpha2Code: "cs", English: "Czech"},
|
||||
{Alpha3bCode: "dan", Alpha2Code: "da", English: "Danish"},
|
||||
{Alpha3bCode: "div", Alpha2Code: "dv", English: "Divehi; Dhivehi; Maldivian"},
|
||||
{Alpha3bCode: "dut", Alpha2Code: "nl", English: "Dutch; Flemish"},
|
||||
{Alpha3bCode: "dzo", Alpha2Code: "dz", English: "Dzongkha"},
|
||||
{Alpha3bCode: "eng", Alpha2Code: "en", English: "English"},
|
||||
{Alpha3bCode: "epo", Alpha2Code: "eo", English: "Esperanto"},
|
||||
{Alpha3bCode: "est", Alpha2Code: "et", English: "Estonian"},
|
||||
{Alpha3bCode: "ewe", Alpha2Code: "ee", English: "Ewe"},
|
||||
{Alpha3bCode: "fao", Alpha2Code: "fo", English: "Faroese"},
|
||||
{Alpha3bCode: "fij", Alpha2Code: "fj", English: "Fijian"},
|
||||
{Alpha3bCode: "fin", Alpha2Code: "fi", English: "Finnish"},
|
||||
{Alpha3bCode: "fre", Alpha2Code: "fr", English: "French"},
|
||||
{Alpha3bCode: "fry", Alpha2Code: "fy", English: "Western Frisian"},
|
||||
{Alpha3bCode: "ful", Alpha2Code: "ff", English: "Fulah"},
|
||||
{Alpha3bCode: "geo", Alpha2Code: "ka", English: "Georgian"},
|
||||
{Alpha3bCode: "ger", Alpha2Code: "de", English: "German"},
|
||||
{Alpha3bCode: "gla", Alpha2Code: "gd", English: "Gaelic; Scottish Gaelic"},
|
||||
{Alpha3bCode: "gle", Alpha2Code: "ga", English: "Irish"},
|
||||
{Alpha3bCode: "glg", Alpha2Code: "gl", English: "Galician"},
|
||||
{Alpha3bCode: "glv", Alpha2Code: "gv", English: "Manx"},
|
||||
{Alpha3bCode: "gre", Alpha2Code: "el", English: "Greek, Modern (1453-)"},
|
||||
{Alpha3bCode: "grn", Alpha2Code: "gn", English: "Guarani"},
|
||||
{Alpha3bCode: "guj", Alpha2Code: "gu", English: "Gujarati"},
|
||||
{Alpha3bCode: "hat", Alpha2Code: "ht", English: "Haitian; Haitian Creole"},
|
||||
{Alpha3bCode: "hau", Alpha2Code: "ha", English: "Hausa"},
|
||||
{Alpha3bCode: "heb", Alpha2Code: "he", English: "Hebrew"},
|
||||
{Alpha3bCode: "her", Alpha2Code: "hz", English: "Herero"},
|
||||
{Alpha3bCode: "hin", Alpha2Code: "hi", English: "Hindi"},
|
||||
{Alpha3bCode: "hmo", Alpha2Code: "ho", English: "Hiri Motu"},
|
||||
{Alpha3bCode: "hrv", Alpha2Code: "hr", English: "Croatian"},
|
||||
{Alpha3bCode: "hun", Alpha2Code: "hu", English: "Hungarian"},
|
||||
{Alpha3bCode: "ibo", Alpha2Code: "ig", English: "Igbo"},
|
||||
{Alpha3bCode: "ice", Alpha2Code: "is", English: "Icelandic"},
|
||||
{Alpha3bCode: "ido", Alpha2Code: "io", English: "Ido"},
|
||||
{Alpha3bCode: "iii", Alpha2Code: "ii", English: "Sichuan Yi; Nuosu"},
|
||||
{Alpha3bCode: "iku", Alpha2Code: "iu", English: "Inuktitut"},
|
||||
{Alpha3bCode: "ile", Alpha2Code: "ie", English: "Interlingue; Occidental"},
|
||||
{Alpha3bCode: "ina", Alpha2Code: "ia", English: "Interlingua (International Auxiliary Language Association)"},
|
||||
{Alpha3bCode: "ind", Alpha2Code: "id", English: "Indonesian"},
|
||||
{Alpha3bCode: "ipk", Alpha2Code: "ik", English: "Inupiaq"},
|
||||
{Alpha3bCode: "ita", Alpha2Code: "it", English: "Italian"},
|
||||
{Alpha3bCode: "jav", Alpha2Code: "jv", English: "Javanese"},
|
||||
{Alpha3bCode: "jpn", Alpha2Code: "ja", English: "Japanese"},
|
||||
{Alpha3bCode: "kal", Alpha2Code: "kl", English: "Kalaallisut; Greenlandic"},
|
||||
{Alpha3bCode: "kan", Alpha2Code: "kn", English: "Kannada"},
|
||||
{Alpha3bCode: "kas", Alpha2Code: "ks", English: "Kashmiri"},
|
||||
{Alpha3bCode: "kau", Alpha2Code: "kr", English: "Kanuri"},
|
||||
{Alpha3bCode: "kaz", Alpha2Code: "kk", English: "Kazakh"},
|
||||
{Alpha3bCode: "khm", Alpha2Code: "km", English: "Central Khmer"},
|
||||
{Alpha3bCode: "kik", Alpha2Code: "ki", English: "Kikuyu; Gikuyu"},
|
||||
{Alpha3bCode: "kin", Alpha2Code: "rw", English: "Kinyarwanda"},
|
||||
{Alpha3bCode: "kir", Alpha2Code: "ky", English: "Kirghiz; Kyrgyz"},
|
||||
{Alpha3bCode: "kom", Alpha2Code: "kv", English: "Komi"},
|
||||
{Alpha3bCode: "kon", Alpha2Code: "kg", English: "Kongo"},
|
||||
{Alpha3bCode: "kor", Alpha2Code: "ko", English: "Korean"},
|
||||
{Alpha3bCode: "kua", Alpha2Code: "kj", English: "Kuanyama; Kwanyama"},
|
||||
{Alpha3bCode: "kur", Alpha2Code: "ku", English: "Kurdish"},
|
||||
{Alpha3bCode: "lao", Alpha2Code: "lo", English: "Lao"},
|
||||
{Alpha3bCode: "lat", Alpha2Code: "la", English: "Latin"},
|
||||
{Alpha3bCode: "lav", Alpha2Code: "lv", English: "Latvian"},
|
||||
{Alpha3bCode: "lim", Alpha2Code: "li", English: "Limburgan; Limburger; Limburgish"},
|
||||
{Alpha3bCode: "lin", Alpha2Code: "ln", English: "Lingala"},
|
||||
{Alpha3bCode: "lit", Alpha2Code: "lt", English: "Lithuanian"},
|
||||
{Alpha3bCode: "ltz", Alpha2Code: "lb", English: "Luxembourgish; Letzeburgesch"},
|
||||
{Alpha3bCode: "lub", Alpha2Code: "lu", English: "Luba-Katanga"},
|
||||
{Alpha3bCode: "lug", Alpha2Code: "lg", English: "Ganda"},
|
||||
{Alpha3bCode: "mac", Alpha2Code: "mk", English: "Macedonian"},
|
||||
{Alpha3bCode: "mah", Alpha2Code: "mh", English: "Marshallese"},
|
||||
{Alpha3bCode: "mal", Alpha2Code: "ml", English: "Malayalam"},
|
||||
{Alpha3bCode: "mao", Alpha2Code: "mi", English: "Maori"},
|
||||
{Alpha3bCode: "mar", Alpha2Code: "mr", English: "Marathi"},
|
||||
{Alpha3bCode: "may", Alpha2Code: "ms", English: "Malay"},
|
||||
{Alpha3bCode: "mlg", Alpha2Code: "mg", English: "Malagasy"},
|
||||
{Alpha3bCode: "mlt", Alpha2Code: "mt", English: "Maltese"},
|
||||
{Alpha3bCode: "mon", Alpha2Code: "mn", English: "Mongolian"},
|
||||
{Alpha3bCode: "nau", Alpha2Code: "na", English: "Nauru"},
|
||||
{Alpha3bCode: "nav", Alpha2Code: "nv", English: "Navajo; Navaho"},
|
||||
{Alpha3bCode: "nbl", Alpha2Code: "nr", English: "Ndebele, South; South Ndebele"},
|
||||
{Alpha3bCode: "nde", Alpha2Code: "nd", English: "Ndebele, North; North Ndebele"},
|
||||
{Alpha3bCode: "ndo", Alpha2Code: "ng", English: "Ndonga"},
|
||||
{Alpha3bCode: "nep", Alpha2Code: "ne", English: "Nepali"},
|
||||
{Alpha3bCode: "nno", Alpha2Code: "nn", English: "Norwegian Nynorsk; Nynorsk, Norwegian"},
|
||||
{Alpha3bCode: "nob", Alpha2Code: "nb", English: "Bokmål, Norwegian; Norwegian Bokmål"},
|
||||
{Alpha3bCode: "nor", Alpha2Code: "no", English: "Norwegian"},
|
||||
{Alpha3bCode: "nya", Alpha2Code: "ny", English: "Chichewa; Chewa; Nyanja"},
|
||||
{Alpha3bCode: "oci", Alpha2Code: "oc", English: "Occitan (post 1500); Provençal"},
|
||||
{Alpha3bCode: "oji", Alpha2Code: "oj", English: "Ojibwa"},
|
||||
{Alpha3bCode: "ori", Alpha2Code: "or", English: "Oriya"},
|
||||
{Alpha3bCode: "orm", Alpha2Code: "om", English: "Oromo"},
|
||||
{Alpha3bCode: "oss", Alpha2Code: "os", English: "Ossetian; Ossetic"},
|
||||
{Alpha3bCode: "pan", Alpha2Code: "pa", English: "Panjabi; Punjabi"},
|
||||
{Alpha3bCode: "per", Alpha2Code: "fa", English: "Persian"},
|
||||
{Alpha3bCode: "pli", Alpha2Code: "pi", English: "Pali"},
|
||||
{Alpha3bCode: "pol", Alpha2Code: "pl", English: "Polish"},
|
||||
{Alpha3bCode: "por", Alpha2Code: "pt", English: "Portuguese"},
|
||||
{Alpha3bCode: "pus", Alpha2Code: "ps", English: "Pushto; Pashto"},
|
||||
{Alpha3bCode: "que", Alpha2Code: "qu", English: "Quechua"},
|
||||
{Alpha3bCode: "roh", Alpha2Code: "rm", English: "Romansh"},
|
||||
{Alpha3bCode: "rum", Alpha2Code: "ro", English: "Romanian; Moldavian; Moldovan"},
|
||||
{Alpha3bCode: "run", Alpha2Code: "rn", English: "Rundi"},
|
||||
{Alpha3bCode: "rus", Alpha2Code: "ru", English: "Russian"},
|
||||
{Alpha3bCode: "sag", Alpha2Code: "sg", English: "Sango"},
|
||||
{Alpha3bCode: "san", Alpha2Code: "sa", English: "Sanskrit"},
|
||||
{Alpha3bCode: "sin", Alpha2Code: "si", English: "Sinhala; Sinhalese"},
|
||||
{Alpha3bCode: "slo", Alpha2Code: "sk", English: "Slovak"},
|
||||
{Alpha3bCode: "slv", Alpha2Code: "sl", English: "Slovenian"},
|
||||
{Alpha3bCode: "sme", Alpha2Code: "se", English: "Northern Sami"},
|
||||
{Alpha3bCode: "smo", Alpha2Code: "sm", English: "Samoan"},
|
||||
{Alpha3bCode: "sna", Alpha2Code: "sn", English: "Shona"},
|
||||
{Alpha3bCode: "snd", Alpha2Code: "sd", English: "Sindhi"},
|
||||
{Alpha3bCode: "som", Alpha2Code: "so", English: "Somali"},
|
||||
{Alpha3bCode: "sot", Alpha2Code: "st", English: "Sotho, Southern"},
|
||||
{Alpha3bCode: "spa", Alpha2Code: "es", English: "Spanish; Castilian"},
|
||||
{Alpha3bCode: "srd", Alpha2Code: "sc", English: "Sardinian"},
|
||||
{Alpha3bCode: "srp", Alpha2Code: "sr", English: "Serbian"},
|
||||
{Alpha3bCode: "ssw", Alpha2Code: "ss", English: "Swati"},
|
||||
{Alpha3bCode: "sun", Alpha2Code: "su", English: "Sundanese"},
|
||||
{Alpha3bCode: "swa", Alpha2Code: "sw", English: "Swahili"},
|
||||
{Alpha3bCode: "swe", Alpha2Code: "sv", English: "Swedish"},
|
||||
{Alpha3bCode: "tah", Alpha2Code: "ty", English: "Tahitian"},
|
||||
{Alpha3bCode: "tam", Alpha2Code: "ta", English: "Tamil"},
|
||||
{Alpha3bCode: "tat", Alpha2Code: "tt", English: "Tatar"},
|
||||
{Alpha3bCode: "tel", Alpha2Code: "te", English: "Telugu"},
|
||||
{Alpha3bCode: "tgk", Alpha2Code: "tg", English: "Tajik"},
|
||||
{Alpha3bCode: "tgl", Alpha2Code: "tl", English: "Tagalog"},
|
||||
{Alpha3bCode: "tha", Alpha2Code: "th", English: "Thai"},
|
||||
{Alpha3bCode: "tib", Alpha2Code: "bo", English: "Tibetan"},
|
||||
{Alpha3bCode: "tir", Alpha2Code: "ti", English: "Tigrinya"},
|
||||
{Alpha3bCode: "ton", Alpha2Code: "to", English: "Tonga (Tonga Islands)"},
|
||||
{Alpha3bCode: "tsn", Alpha2Code: "tn", English: "Tswana"},
|
||||
{Alpha3bCode: "tso", Alpha2Code: "ts", English: "Tsonga"},
|
||||
{Alpha3bCode: "tuk", Alpha2Code: "tk", English: "Turkmen"},
|
||||
{Alpha3bCode: "tur", Alpha2Code: "tr", English: "Turkish"},
|
||||
{Alpha3bCode: "twi", Alpha2Code: "tw", English: "Twi"},
|
||||
{Alpha3bCode: "uig", Alpha2Code: "ug", English: "Uighur; Uyghur"},
|
||||
{Alpha3bCode: "ukr", Alpha2Code: "uk", English: "Ukrainian"},
|
||||
{Alpha3bCode: "urd", Alpha2Code: "ur", English: "Urdu"},
|
||||
{Alpha3bCode: "uzb", Alpha2Code: "uz", English: "Uzbek"},
|
||||
{Alpha3bCode: "ven", Alpha2Code: "ve", English: "Venda"},
|
||||
{Alpha3bCode: "vie", Alpha2Code: "vi", English: "Vietnamese"},
|
||||
{Alpha3bCode: "vol", Alpha2Code: "vo", English: "Volapük"},
|
||||
{Alpha3bCode: "wel", Alpha2Code: "cy", English: "Welsh"},
|
||||
{Alpha3bCode: "wln", Alpha2Code: "wa", English: "Walloon"},
|
||||
{Alpha3bCode: "wol", Alpha2Code: "wo", English: "Wolof"},
|
||||
{Alpha3bCode: "xho", Alpha2Code: "xh", English: "Xhosa"},
|
||||
{Alpha3bCode: "yid", Alpha2Code: "yi", English: "Yiddish"},
|
||||
{Alpha3bCode: "yor", Alpha2Code: "yo", English: "Yoruba"},
|
||||
{Alpha3bCode: "zha", Alpha2Code: "za", English: "Zhuang; Chuang"},
|
||||
{Alpha3bCode: "zul", Alpha2Code: "zu", English: "Zulu"},
|
||||
}

270 vendor/github.com/asaskevich/govalidator/utils.go generated vendored Normal file

@@ -0,0 +1,270 @@
package govalidator

import (
	"errors"
	"fmt"
	"html"
	"math"
	"path"
	"regexp"
	"strings"
	"unicode"
	"unicode/utf8"
)

// Contains checks if the string contains the substring.
func Contains(str, substring string) bool {
	return strings.Contains(str, substring)
}

// Matches checks if string matches the pattern (pattern is regular expression)
// In case of error return false
func Matches(str, pattern string) bool {
	match, _ := regexp.MatchString(pattern, str)
	return match
}
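Editorial note, not part of the vendored file: Contains and Matches in use; the sample strings are made up.

```go
package main

import (
	"fmt"

	"github.com/asaskevich/govalidator"
)

func main() {
	fmt.Println(govalidator.Contains("gotosocial", "social"))    // true
	fmt.Println(govalidator.Matches("abc123", `^[a-z]+[0-9]+$`)) // true
	fmt.Println(govalidator.Matches("abc123", `^[0-9]+$`))       // false
}
```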
|
||||
|
||||
// LeftTrim trims characters from the left side of the input.
|
||||
// If second argument is empty, it will remove leading spaces.
|
||||
func LeftTrim(str, chars string) string {
|
||||
if chars == "" {
|
||||
return strings.TrimLeftFunc(str, unicode.IsSpace)
|
||||
}
|
||||
r, _ := regexp.Compile("^[" + chars + "]+")
|
||||
return r.ReplaceAllString(str, "")
|
||||
}
|
||||
|
||||
// RightTrim trims characters from the right side of the input.
|
||||
// If second argument is empty, it will remove trailing spaces.
|
||||
func RightTrim(str, chars string) string {
|
||||
if chars == "" {
|
||||
return strings.TrimRightFunc(str, unicode.IsSpace)
|
||||
}
|
||||
r, _ := regexp.Compile("[" + chars + "]+$")
|
||||
return r.ReplaceAllString(str, "")
|
||||
}
|
||||
|
||||
// Trim trims characters from both sides of the input.
|
||||
// If second argument is empty, it will remove spaces.
|
||||
func Trim(str, chars string) string {
|
||||
return LeftTrim(RightTrim(str, chars), chars)
|
||||
}
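Editorial note, not part of the vendored file: a sketch of trimming with an explicit character set versus the default whitespace behaviour described in the comments above.

```go
package main

import (
	"fmt"

	"github.com/asaskevich/govalidator"
)

func main() {
	fmt.Println(govalidator.Trim("  __hello__  ", "_ ")) // hello
	fmt.Println(govalidator.LeftTrim("   indented", "")) // indented (empty chars trims whitespace)
}
```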
|
||||
|
||||
// WhiteList removes characters that do not appear in the whitelist.
|
||||
func WhiteList(str, chars string) string {
|
||||
pattern := "[^" + chars + "]+"
|
||||
r, _ := regexp.Compile(pattern)
|
||||
return r.ReplaceAllString(str, "")
|
||||
}
|
||||
|
||||
// BlackList removes characters that appear in the blacklist.
|
||||
func BlackList(str, chars string) string {
|
||||
pattern := "[" + chars + "]+"
|
||||
r, _ := regexp.Compile(pattern)
|
||||
return r.ReplaceAllString(str, "")
|
||||
}
|
||||
|
||||
// StripLow removes characters with a numerical value < 32 and 127, mostly control characters.
|
||||
// If keep_new_lines is true, newline characters are preserved (\n and \r, hex 0xA and 0xD).
|
||||
func StripLow(str string, keepNewLines bool) string {
|
||||
chars := ""
|
||||
if keepNewLines {
|
||||
chars = "\x00-\x09\x0B\x0C\x0E-\x1F\x7F"
|
||||
} else {
|
||||
chars = "\x00-\x1F\x7F"
|
||||
}
|
||||
return BlackList(str, chars)
|
||||
}
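Editorial note, not part of the vendored file: whitelisting and blacklisting character ranges with the helpers above; the inputs are made up.

```go
package main

import (
	"fmt"

	"github.com/asaskevich/govalidator"
)

func main() {
	fmt.Println(govalidator.WhiteList("abc123!?", "a-z")) // abc
	fmt.Println(govalidator.BlackList("abc123!?", "0-9")) // abc!?
}
```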
|
||||
|
||||
// ReplacePattern replaces regular expression pattern in string
|
||||
func ReplacePattern(str, pattern, replace string) string {
|
||||
r, _ := regexp.Compile(pattern)
|
||||
return r.ReplaceAllString(str, replace)
|
||||
}
|
||||
|
||||
// Escape replaces <, >, & and " with HTML entities.
|
||||
var Escape = html.EscapeString
|
||||
|
||||
func addSegment(inrune, segment []rune) []rune {
|
||||
if len(segment) == 0 {
|
||||
return inrune
|
||||
}
|
||||
if len(inrune) != 0 {
|
||||
inrune = append(inrune, '_')
|
||||
}
|
||||
inrune = append(inrune, segment...)
|
||||
return inrune
|
||||
}
|
||||
|
||||
// UnderscoreToCamelCase converts from underscore separated form to camel case form.
|
||||
// Ex.: my_func => MyFunc
|
||||
func UnderscoreToCamelCase(s string) string {
|
||||
return strings.Replace(strings.Title(strings.Replace(strings.ToLower(s), "_", " ", -1)), " ", "", -1)
|
||||
}
|
||||
|
||||
// CamelCaseToUnderscore converts from camel case form to underscore separated form.
|
||||
// Ex.: MyFunc => my_func
|
||||
func CamelCaseToUnderscore(str string) string {
|
||||
var output []rune
|
||||
var segment []rune
|
||||
for _, r := range str {
|
||||
|
||||
// not treat number as separate segment
|
||||
if !unicode.IsLower(r) && string(r) != "_" && !unicode.IsNumber(r) {
|
||||
output = addSegment(output, segment)
|
||||
segment = nil
|
||||
}
|
||||
segment = append(segment, unicode.ToLower(r))
|
||||
}
|
||||
output = addSegment(output, segment)
|
||||
return string(output)
|
||||
}
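Editorial note, not part of the vendored file: the two case conversions above, round-tripped on the example from their own doc comments.

```go
package main

import (
	"fmt"

	"github.com/asaskevich/govalidator"
)

func main() {
	fmt.Println(govalidator.CamelCaseToUnderscore("MyFunc"))  // my_func
	fmt.Println(govalidator.UnderscoreToCamelCase("my_func")) // MyFunc
}
```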
|
||||
|
||||
// Reverse returns reversed string
|
||||
func Reverse(s string) string {
|
||||
r := []rune(s)
|
||||
for i, j := 0, len(r)-1; i < j; i, j = i+1, j-1 {
|
||||
r[i], r[j] = r[j], r[i]
|
||||
}
|
||||
return string(r)
|
||||
}
|
||||
|
||||
// GetLines splits string by "\n" and return array of lines
|
||||
func GetLines(s string) []string {
|
||||
return strings.Split(s, "\n")
|
||||
}
|
||||
|
||||
// GetLine returns specified line of multiline string
|
||||
func GetLine(s string, index int) (string, error) {
|
||||
lines := GetLines(s)
|
||||
if index < 0 || index >= len(lines) {
|
||||
return "", errors.New("line index out of bounds")
|
||||
}
|
||||
return lines[index], nil
|
||||
}
|
||||
|
||||
// RemoveTags removes all tags from HTML string
|
||||
func RemoveTags(s string) string {
|
||||
return ReplacePattern(s, "<[^>]*>", "")
|
||||
}
|
||||
|
||||
// SafeFileName returns safe string that can be used in file names
|
||||
func SafeFileName(str string) string {
|
||||
name := strings.ToLower(str)
|
||||
name = path.Clean(path.Base(name))
|
||||
name = strings.Trim(name, " ")
|
||||
separators, err := regexp.Compile(`[ &_=+:]`)
|
||||
if err == nil {
|
||||
name = separators.ReplaceAllString(name, "-")
|
||||
}
|
||||
legal, err := regexp.Compile(`[^[:alnum:]-.]`)
|
||||
if err == nil {
|
||||
name = legal.ReplaceAllString(name, "")
|
||||
}
|
||||
for strings.Contains(name, "--") {
|
||||
name = strings.Replace(name, "--", "-", -1)
|
||||
}
|
||||
return name
|
||||
}
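Editorial note, not part of the vendored file: a sketch of SafeFileName; the input string is invented.

```go
package main

import (
	"fmt"

	"github.com/asaskevich/govalidator"
)

func main() {
	// Lowercased, separators replaced with "-", illegal characters dropped.
	fmt.Println(govalidator.SafeFileName("My Cool File=Name.TXT")) // my-cool-file-name.txt
}
```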

// NormalizeEmail canonicalize an email address.
// The local part of the email address is lowercased for all domains; the hostname is always lowercased and
// the local part of the email address is always lowercased for hosts that are known to be case-insensitive (currently only GMail).
// Normalization follows special rules for known providers: currently, GMail addresses have dots removed in the local part and
// are stripped of tags (e.g. some.one+tag@gmail.com becomes someone@gmail.com) and all @googlemail.com addresses are
// normalized to @gmail.com.
func NormalizeEmail(str string) (string, error) {
	if !IsEmail(str) {
		return "", fmt.Errorf("%s is not an email", str)
	}
	parts := strings.Split(str, "@")
	parts[0] = strings.ToLower(parts[0])
	parts[1] = strings.ToLower(parts[1])
	if parts[1] == "gmail.com" || parts[1] == "googlemail.com" {
		parts[1] = "gmail.com"
		parts[0] = strings.Split(ReplacePattern(parts[0], `\.`, ""), "+")[0]
	}
	return strings.Join(parts, "@"), nil
}
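Editorial note, not part of the vendored file: NormalizeEmail applied to a GMail-style address, following the rules in the comment above.

```go
package main

import (
	"fmt"

	"github.com/asaskevich/govalidator"
)

func main() {
	out, err := govalidator.NormalizeEmail("Some.One+news@GMail.com")
	fmt.Println(out, err) // someone@gmail.com <nil>
}
```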
|
||||
|
||||
// Truncate a string to the closest length without breaking words.
|
||||
func Truncate(str string, length int, ending string) string {
|
||||
var aftstr, befstr string
|
||||
if len(str) > length {
|
||||
words := strings.Fields(str)
|
||||
before, present := 0, 0
|
||||
for i := range words {
|
||||
befstr = aftstr
|
||||
before = present
|
||||
aftstr = aftstr + words[i] + " "
|
||||
present = len(aftstr)
|
||||
if present > length && i != 0 {
|
||||
if (length - before) < (present - length) {
|
||||
return Trim(befstr, " /\\.,\"'#!?&@+-") + ending
|
||||
}
|
||||
return Trim(aftstr, " /\\.,\"'#!?&@+-") + ending
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
return str
|
||||
}
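Editorial note, not part of the vendored file: Truncate cutting at the nearest word boundary; the sentence is made up.

```go
package main

import (
	"fmt"

	"github.com/asaskevich/govalidator"
)

func main() {
	s := "The quick brown fox jumps over the lazy dog"
	fmt.Println(govalidator.Truncate(s, 25, "...")) // The quick brown fox jumps...
}
```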
|
||||
|
||||
// PadLeft pads left side of a string if size of string is less then indicated pad length
|
||||
func PadLeft(str string, padStr string, padLen int) string {
|
||||
return buildPadStr(str, padStr, padLen, true, false)
|
||||
}
|
||||
|
||||
// PadRight pads right side of a string if size of string is less then indicated pad length
|
||||
func PadRight(str string, padStr string, padLen int) string {
|
||||
return buildPadStr(str, padStr, padLen, false, true)
|
||||
}
|
||||
|
||||
// PadBoth pads both sides of a string if size of string is less then indicated pad length
|
||||
func PadBoth(str string, padStr string, padLen int) string {
|
||||
return buildPadStr(str, padStr, padLen, true, true)
|
||||
}
|
||||
|
||||
// PadString either left, right or both sides.
|
||||
// Note that padding string can be unicode and more then one character
|
||||
func buildPadStr(str string, padStr string, padLen int, padLeft bool, padRight bool) string {
|
||||
|
||||
// When padded length is less then the current string size
|
||||
if padLen < utf8.RuneCountInString(str) {
|
||||
return str
|
||||
}
|
||||
|
||||
padLen -= utf8.RuneCountInString(str)
|
||||
|
||||
targetLen := padLen
|
||||
|
||||
targetLenLeft := targetLen
|
||||
targetLenRight := targetLen
|
||||
if padLeft && padRight {
|
||||
targetLenLeft = padLen / 2
|
||||
targetLenRight = padLen - targetLenLeft
|
||||
}
|
||||
|
||||
strToRepeatLen := utf8.RuneCountInString(padStr)
|
||||
|
||||
repeatTimes := int(math.Ceil(float64(targetLen) / float64(strToRepeatLen)))
|
||||
repeatedString := strings.Repeat(padStr, repeatTimes)
|
||||
|
||||
leftSide := ""
|
||||
if padLeft {
|
||||
leftSide = repeatedString[0:targetLenLeft]
|
||||
}
|
||||
|
||||
rightSide := ""
|
||||
if padRight {
|
||||
rightSide = repeatedString[0:targetLenRight]
|
||||
}
|
||||
|
||||
return leftSide + str + rightSide
|
||||
}
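Editorial note, not part of the vendored file: the padding helpers above, built on buildPadStr.

```go
package main

import (
	"fmt"

	"github.com/asaskevich/govalidator"
)

func main() {
	fmt.Println(govalidator.PadLeft("123", "0", 6)) // 000123
	fmt.Println(govalidator.PadBoth("ab", "-", 6))  // --ab--
}
```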
|
||||
|
||||
// TruncatingErrorf removes extra args from fmt.Errorf if not formatted in the str object
|
||||
func TruncatingErrorf(str string, args ...interface{}) error {
|
||||
n := strings.Count(str, "%s")
|
||||
return fmt.Errorf(str, args[:n]...)
|
||||
}
|
||||
1768 vendor/github.com/asaskevich/govalidator/validator.go generated vendored Normal file
File diff suppressed because it is too large

15 vendor/github.com/asaskevich/govalidator/wercker.yml generated vendored Normal file

@@ -0,0 +1,15 @@
box: golang
build:
  steps:
    - setup-go-workspace

    - script:
        name: go get
        code: |
          go version
          go get -t ./...

    - script:
        name: go test
        code: |
          go test -race -v ./...
70 vendor/github.com/docker/go-units/size.go generated vendored

@@ -2,7 +2,6 @@ package units
|
|||
|
||||
import (
|
||||
"fmt"
|
||||
"regexp"
|
||||
"strconv"
|
||||
"strings"
|
||||
)
|
||||
|
|
@ -26,16 +25,17 @@ const (
|
|||
PiB = 1024 * TiB
|
||||
)
|
||||
|
||||
type unitMap map[string]int64
|
||||
type unitMap map[byte]int64
|
||||
|
||||
var (
|
||||
decimalMap = unitMap{"k": KB, "m": MB, "g": GB, "t": TB, "p": PB}
|
||||
binaryMap = unitMap{"k": KiB, "m": MiB, "g": GiB, "t": TiB, "p": PiB}
|
||||
sizeRegex = regexp.MustCompile(`^(\d+(\.\d+)*) ?([kKmMgGtTpP])?[iI]?[bB]?$`)
|
||||
decimalMap = unitMap{'k': KB, 'm': MB, 'g': GB, 't': TB, 'p': PB}
|
||||
binaryMap = unitMap{'k': KiB, 'm': MiB, 'g': GiB, 't': TiB, 'p': PiB}
|
||||
)
|
||||
|
||||
var decimapAbbrs = []string{"B", "kB", "MB", "GB", "TB", "PB", "EB", "ZB", "YB"}
|
||||
var binaryAbbrs = []string{"B", "KiB", "MiB", "GiB", "TiB", "PiB", "EiB", "ZiB", "YiB"}
|
||||
var (
|
||||
decimapAbbrs = []string{"B", "kB", "MB", "GB", "TB", "PB", "EB", "ZB", "YB"}
|
||||
binaryAbbrs = []string{"B", "KiB", "MiB", "GiB", "TiB", "PiB", "EiB", "ZiB", "YiB"}
|
||||
)
|
||||
|
||||
func getSizeAndUnit(size float64, base float64, _map []string) (float64, string) {
|
||||
i := 0
|
||||
|
|
@ -89,20 +89,66 @@ func RAMInBytes(size string) (int64, error) {
|
|||
|
||||
// Parses the human-readable size string into the amount it represents.
|
||||
func parseSize(sizeStr string, uMap unitMap) (int64, error) {
|
||||
matches := sizeRegex.FindStringSubmatch(sizeStr)
|
||||
if len(matches) != 4 {
|
||||
// TODO: rewrite to use strings.Cut if there's a space
|
||||
// once Go < 1.18 is deprecated.
|
||||
sep := strings.LastIndexAny(sizeStr, "01234567890. ")
|
||||
if sep == -1 {
|
||||
// There should be at least a digit.
|
||||
return -1, fmt.Errorf("invalid size: '%s'", sizeStr)
|
||||
}
|
||||
var num, sfx string
|
||||
if sizeStr[sep] != ' ' {
|
||||
num = sizeStr[:sep+1]
|
||||
sfx = sizeStr[sep+1:]
|
||||
} else {
|
||||
// Omit the space separator.
|
||||
num = sizeStr[:sep]
|
||||
sfx = sizeStr[sep+1:]
|
||||
}
|
||||
|
||||
size, err := strconv.ParseFloat(matches[1], 64)
|
||||
size, err := strconv.ParseFloat(num, 64)
|
||||
if err != nil {
|
||||
return -1, err
|
||||
}
|
||||
// Backward compatibility: reject negative sizes.
|
||||
if size < 0 {
|
||||
return -1, fmt.Errorf("invalid size: '%s'", sizeStr)
|
||||
}
|
||||
|
||||
unitPrefix := strings.ToLower(matches[3])
|
||||
if mul, ok := uMap[unitPrefix]; ok {
|
||||
if len(sfx) == 0 {
|
||||
return int64(size), nil
|
||||
}
|
||||
|
||||
// Process the suffix.
|
||||
|
||||
if len(sfx) > 3 { // Too long.
|
||||
goto badSuffix
|
||||
}
|
||||
sfx = strings.ToLower(sfx)
|
||||
// Trivial case: b suffix.
|
||||
if sfx[0] == 'b' {
|
||||
if len(sfx) > 1 { // no extra characters allowed after b.
|
||||
goto badSuffix
|
||||
}
|
||||
return int64(size), nil
|
||||
}
|
||||
// A suffix from the map.
|
||||
if mul, ok := uMap[sfx[0]]; ok {
|
||||
size *= float64(mul)
|
||||
} else {
|
||||
goto badSuffix
|
||||
}
|
||||
|
||||
// The suffix may have extra "b" or "ib" (e.g. KiB or MB).
|
||||
switch {
|
||||
case len(sfx) == 2 && sfx[1] != 'b':
|
||||
goto badSuffix
|
||||
case len(sfx) == 3 && sfx[1:] != "ib":
|
||||
goto badSuffix
|
||||
}
|
||||
|
||||
return int64(size), nil
|
||||
|
||||
badSuffix:
|
||||
return -1, fmt.Errorf("invalid suffix: '%s'", sfx)
|
||||
}
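Editorial note, not part of the vendored diff: a sketch of the rewritten suffix handling as exercised through RAMInBytes (the function named in the hunk header above), assuming the vendored module is imported by its upstream path.

```go
package main

import (
	"fmt"

	units "github.com/docker/go-units"
)

func main() {
	// RAMInBytes uses the binary (1024-based) unit map, so the "GiB" suffix
	// takes the multi-character suffix path shown in the hunk above.
	n, err := units.RAMInBytes("2.5GiB")
	fmt.Println(n, err) // 2684354560 <nil>
}
```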
|
||||
|
|
|
|||
0 vendor/github.com/felixge/httpsnoop/.gitignore generated vendored Normal file

6 vendor/github.com/felixge/httpsnoop/.travis.yml generated vendored Normal file

@@ -0,0 +1,6 @@
language: go

go:
  - 1.6
  - 1.7
  - 1.8
19 vendor/github.com/felixge/httpsnoop/LICENSE.txt generated vendored Normal file

@@ -0,0 +1,19 @@
|
|||
Copyright (c) 2016 Felix Geisendörfer (felix@debuggable.com)
|
||||
|
||||
Permission is hereby granted, free of charge, to any person obtaining a copy
|
||||
of this software and associated documentation files (the "Software"), to deal
|
||||
in the Software without restriction, including without limitation the rights
|
||||
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
|
||||
copies of the Software, and to permit persons to whom the Software is
|
||||
furnished to do so, subject to the following conditions:
|
||||
|
||||
The above copyright notice and this permission notice shall be included in
|
||||
all copies or substantial portions of the Software.
|
||||
|
||||
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
|
||||
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
|
||||
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
|
||||
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
|
||||
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
|
||||
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
|
||||
THE SOFTWARE.
|
||||
10 vendor/github.com/felixge/httpsnoop/Makefile generated vendored Normal file

@@ -0,0 +1,10 @@
.PHONY: ci generate clean

ci: clean generate
	go test -v ./...

generate:
	go generate .

clean:
	rm -rf *_generated*.go

95 vendor/github.com/felixge/httpsnoop/README.md generated vendored Normal file

@@ -0,0 +1,95 @@
# httpsnoop

Package httpsnoop provides an easy way to capture http related metrics (i.e.
response time, bytes written, and http status code) from your application's
http.Handlers.

Doing this requires non-trivial wrapping of the http.ResponseWriter interface,
which is also exposed for users interested in a more low-level API.

[](https://godoc.org/github.com/felixge/httpsnoop)
[](https://travis-ci.org/felixge/httpsnoop)

## Usage Example

```go
// myH is your app's http handler, perhaps a http.ServeMux or similar.
var myH http.Handler
// wrappedH wraps myH in order to log every request.
wrappedH := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
	m := httpsnoop.CaptureMetrics(myH, w, r)
	log.Printf(
		"%s %s (code=%d dt=%s written=%d)",
		r.Method,
		r.URL,
		m.Code,
		m.Duration,
		m.Written,
	)
})
http.ListenAndServe(":8080", wrappedH)
```

## Why this package exists

Instrumenting an application's http.Handler is surprisingly difficult.

However if you google for e.g. "capture ResponseWriter status code" you'll find
lots of advice and code examples that suggest it is a fairly trivial
undertaking. Unfortunately everything I've seen so far has a high chance of
breaking your application.

The main problem is that a `http.ResponseWriter` often implements additional
interfaces such as `http.Flusher`, `http.CloseNotifier`, `http.Hijacker`, `http.Pusher`, and
`io.ReaderFrom`. So the naive approach of just wrapping `http.ResponseWriter`
in your own struct that also implements the `http.ResponseWriter` interface
will hide the additional interfaces mentioned above. This has a high chance of
introducing subtle bugs into any non-trivial application.

Another approach I've seen people take is to return a struct that implements
all of the interfaces above. However, that's also problematic, because it's
difficult to fake some of these interfaces' behaviors when the underlying
`http.ResponseWriter` doesn't have an implementation. It's also dangerous,
because an application may choose to operate differently, merely because it
detects the presence of these additional interfaces.

This package solves this problem by checking which additional interfaces a
`http.ResponseWriter` implements, returning a wrapped version implementing the
exact same set of interfaces.

Additionally this package properly handles edge cases such as `WriteHeader` not
being called, or called more than once, as well as concurrent calls to
`http.ResponseWriter` methods, and even calls happening after the wrapped
`ServeHTTP` has already returned.

Unfortunately this package is not perfect either. It's possible that it is
still missing some interfaces provided by the go core (let me know if you find
one), and it won't work for applications adding their own interfaces into the
mix. You can however use `httpsnoop.Unwrap(w)` to access the underlying
`http.ResponseWriter` and type-assert the result to its other interfaces.

However, hopefully the explanation above has sufficiently scared you off rolling
your own solution to this problem. httpsnoop may still break your application,
but at least it tries to avoid it as much as possible.

Anyway, the real problem here is that smuggling additional interfaces inside
`http.ResponseWriter` is a problematic design choice, but it probably goes as
deep as the Go language specification itself. But that's okay, I still prefer
Go over the alternatives ;).

## Performance

```
BenchmarkBaseline-8           20000    94912 ns/op
BenchmarkCaptureMetrics-8     20000    95461 ns/op
```

As you can see, using `CaptureMetrics` on a vanilla http.Handler introduces an
overhead of ~500 ns per http request on my machine. However, the margin of
error appears to be larger than that, therefore it should be reasonable to
assume that the overhead introduced by `CaptureMetrics` is absolutely
negligible.

## License

MIT

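The Wrap/Hooks mechanism the README describes can also be used directly, without CaptureMetrics. A minimal sketch (not part of the vendored file) that counts response bytes, using only the exported Wrap, Hooks, and WriteFunc names that appear in this diff:

```go
package main

import (
	"fmt"
	"log"
	"net/http"
	"sync/atomic"

	"github.com/felixge/httpsnoop"
)

func main() {
	var totalBytes int64 // running total of bytes written across requests

	inner := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		fmt.Fprintln(w, "hello")
	})

	counting := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		// Wrap returns a ResponseWriter exposing the same optional
		// interfaces as w; the Write hook intercepts every Write call.
		ww := httpsnoop.Wrap(w, httpsnoop.Hooks{
			Write: func(next httpsnoop.WriteFunc) httpsnoop.WriteFunc {
				return func(p []byte) (int, error) {
					n, err := next(p)
					atomic.AddInt64(&totalBytes, int64(n))
					return n, err
				}
			},
		})
		inner.ServeHTTP(ww, r)
	})

	log.Fatal(http.ListenAndServe(":8080", counting))
}
```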
86 vendor/github.com/felixge/httpsnoop/capture_metrics.go generated vendored Normal file

@@ -0,0 +1,86 @@
package httpsnoop

import (
	"io"
	"net/http"
	"time"
)

// Metrics holds metrics captured from CaptureMetrics.
type Metrics struct {
	// Code is the first http response code passed to the WriteHeader func of
	// the ResponseWriter. If no such call is made, a default code of 200 is
	// assumed instead.
	Code int
	// Duration is the time it took to execute the handler.
	Duration time.Duration
	// Written is the number of bytes successfully written by the Write or
	// ReadFrom function of the ResponseWriter. ResponseWriters may also write
	// data to their underlying connection directly (e.g. headers), but those
	// are not tracked. Therefore the number of Written bytes will usually match
	// the size of the response body.
	Written int64
}

// CaptureMetrics wraps the given hnd, executes it with the given w and r, and
// returns the metrics it captured from it.
func CaptureMetrics(hnd http.Handler, w http.ResponseWriter, r *http.Request) Metrics {
	return CaptureMetricsFn(w, func(ww http.ResponseWriter) {
		hnd.ServeHTTP(ww, r)
	})
}

// CaptureMetricsFn wraps w and calls fn with the wrapped w and returns the
// resulting metrics. This is very similar to CaptureMetrics (which is just
// sugar on top of this func), but is a more usable interface if your
// application doesn't use the Go http.Handler interface.
func CaptureMetricsFn(w http.ResponseWriter, fn func(http.ResponseWriter)) Metrics {
	m := Metrics{Code: http.StatusOK}
	m.CaptureMetrics(w, fn)
	return m
}

// CaptureMetrics wraps w and calls fn with the wrapped w and updates
// Metrics m with the resulting metrics. This is similar to CaptureMetricsFn,
// but allows one to customize the starting Metrics object.
func (m *Metrics) CaptureMetrics(w http.ResponseWriter, fn func(http.ResponseWriter)) {
	var (
		start         = time.Now()
		headerWritten bool
		hooks         = Hooks{
			WriteHeader: func(next WriteHeaderFunc) WriteHeaderFunc {
				return func(code int) {
					next(code)

					if !headerWritten {
						m.Code = code
						headerWritten = true
					}
				}
			},

			Write: func(next WriteFunc) WriteFunc {
				return func(p []byte) (int, error) {
					n, err := next(p)

					m.Written += int64(n)
					headerWritten = true
					return n, err
				}
			},

			ReadFrom: func(next ReadFromFunc) ReadFromFunc {
				return func(src io.Reader) (int64, error) {
					n, err := next(src)

					headerWritten = true
					m.Written += n
					return n, err
				}
			},
		}
	)

	fn(Wrap(w, hooks))
	m.Duration += time.Since(start)
}

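CaptureMetricsFn is the lower-level entry point for code that has a ResponseWriter in hand but no http.Handler to delegate to. A minimal sketch (the handler and route names are hypothetical, not part of the vendored file):

```go
package main

import (
	"log"
	"net/http"

	"github.com/felixge/httpsnoop"
)

// logRequest writes the response itself rather than delegating to an
// http.Handler, so it uses CaptureMetricsFn instead of CaptureMetrics.
func logRequest(w http.ResponseWriter, r *http.Request) {
	m := httpsnoop.CaptureMetricsFn(w, func(ww http.ResponseWriter) {
		ww.WriteHeader(http.StatusAccepted)
		_, _ = ww.Write([]byte("queued\n"))
	})
	log.Printf("%s %s -> code=%d written=%d in %s",
		r.Method, r.URL.Path, m.Code, m.Written, m.Duration)
}

func main() {
	http.HandleFunc("/enqueue", logRequest)
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```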
10 vendor/github.com/felixge/httpsnoop/docs.go generated vendored Normal file

@@ -0,0 +1,10 @@
// Package httpsnoop provides an easy way to capture http related metrics (i.e.
// response time, bytes written, and http status code) from your application's
// http.Handlers.
//
// Doing this requires non-trivial wrapping of the http.ResponseWriter
// interface, which is also exposed for users interested in a more low-level
// API.
package httpsnoop

//go:generate go run codegen/main.go

436
vendor/github.com/felixge/httpsnoop/wrap_generated_gteq_1.8.go
generated
vendored
Normal file
436
vendor/github.com/felixge/httpsnoop/wrap_generated_gteq_1.8.go
generated
vendored
Normal file
|
|
@ -0,0 +1,436 @@
|
|||
// +build go1.8
|
||||
// Code generated by "httpsnoop/codegen"; DO NOT EDIT
|
||||
|
||||
package httpsnoop
|
||||
|
||||
import (
|
||||
"bufio"
|
||||
"io"
|
||||
"net"
|
||||
"net/http"
|
||||
)
|
||||
|
||||
// HeaderFunc is part of the http.ResponseWriter interface.
|
||||
type HeaderFunc func() http.Header
|
||||
|
||||
// WriteHeaderFunc is part of the http.ResponseWriter interface.
|
||||
type WriteHeaderFunc func(code int)
|
||||
|
||||
// WriteFunc is part of the http.ResponseWriter interface.
|
||||
type WriteFunc func(b []byte) (int, error)
|
||||
|
||||
// FlushFunc is part of the http.Flusher interface.
|
||||
type FlushFunc func()
|
||||
|
||||
// CloseNotifyFunc is part of the http.CloseNotifier interface.
|
||||
type CloseNotifyFunc func() <-chan bool
|
||||
|
||||
// HijackFunc is part of the http.Hijacker interface.
|
||||
type HijackFunc func() (net.Conn, *bufio.ReadWriter, error)
|
||||
|
||||
// ReadFromFunc is part of the io.ReaderFrom interface.
|
||||
type ReadFromFunc func(src io.Reader) (int64, error)
|
||||
|
||||
// PushFunc is part of the http.Pusher interface.
|
||||
type PushFunc func(target string, opts *http.PushOptions) error
|
||||
|
||||
// Hooks defines a set of method interceptors for methods included in
|
||||
// http.ResponseWriter as well as some others. You can think of them as
|
||||
// middleware for the function calls they target. See Wrap for more details.
|
||||
type Hooks struct {
|
||||
Header func(HeaderFunc) HeaderFunc
|
||||
WriteHeader func(WriteHeaderFunc) WriteHeaderFunc
|
||||
Write func(WriteFunc) WriteFunc
|
||||
Flush func(FlushFunc) FlushFunc
|
||||
CloseNotify func(CloseNotifyFunc) CloseNotifyFunc
|
||||
Hijack func(HijackFunc) HijackFunc
|
||||
ReadFrom func(ReadFromFunc) ReadFromFunc
|
||||
Push func(PushFunc) PushFunc
|
||||
}
|
||||
|
||||
// Wrap returns a wrapped version of w that provides the exact same interface
|
||||
// as w. Specifically if w implements any combination of:
|
||||
//
|
||||
// - http.Flusher
|
||||
// - http.CloseNotifier
|
||||
// - http.Hijacker
|
||||
// - io.ReaderFrom
|
||||
// - http.Pusher
|
||||
//
|
||||
// The wrapped version will implement the exact same combination. If no hooks
|
||||
// are set, the wrapped version also behaves exactly as w. Hooks targeting
|
||||
// methods not supported by w are ignored. Any other hooks will intercept the
|
||||
// method they target and may modify the call's arguments and/or return values.
|
||||
// The CaptureMetrics implementation serves as a working example for how the
|
||||
// hooks can be used.
|
||||
func Wrap(w http.ResponseWriter, hooks Hooks) http.ResponseWriter {
|
||||
rw := &rw{w: w, h: hooks}
|
||||
_, i0 := w.(http.Flusher)
|
||||
_, i1 := w.(http.CloseNotifier)
|
||||
_, i2 := w.(http.Hijacker)
|
||||
_, i3 := w.(io.ReaderFrom)
|
||||
_, i4 := w.(http.Pusher)
|
||||
switch {
|
||||
// combination 1/32
|
||||
case !i0 && !i1 && !i2 && !i3 && !i4:
|
||||
return struct {
|
||||
Unwrapper
|
||||
http.ResponseWriter
|
||||
}{rw, rw}
|
||||
// combination 2/32
|
||||
case !i0 && !i1 && !i2 && !i3 && i4:
|
||||
return struct {
|
||||
Unwrapper
|
||||
http.ResponseWriter
|
||||
http.Pusher
|
||||
}{rw, rw, rw}
|
||||
// combination 3/32
|
||||
case !i0 && !i1 && !i2 && i3 && !i4:
|
||||
return struct {
|
||||
Unwrapper
|
||||
http.ResponseWriter
|
||||
io.ReaderFrom
|
||||
}{rw, rw, rw}
|
||||
// combination 4/32
|
||||
case !i0 && !i1 && !i2 && i3 && i4:
|
||||
return struct {
|
||||
Unwrapper
|
||||
http.ResponseWriter
|
||||
io.ReaderFrom
|
||||
http.Pusher
|
||||
}{rw, rw, rw, rw}
|
||||
// combination 5/32
|
||||
case !i0 && !i1 && i2 && !i3 && !i4:
|
||||
return struct {
|
||||
Unwrapper
|
||||
http.ResponseWriter
|
||||
http.Hijacker
|
||||
}{rw, rw, rw}
|
||||
// combination 6/32
|
||||
case !i0 && !i1 && i2 && !i3 && i4:
|
||||
return struct {
|
||||
Unwrapper
|
||||
http.ResponseWriter
|
||||
http.Hijacker
|
||||
http.Pusher
|
||||
}{rw, rw, rw, rw}
|
||||
// combination 7/32
|
||||
case !i0 && !i1 && i2 && i3 && !i4:
|
||||
return struct {
|
||||
Unwrapper
|
||||
http.ResponseWriter
|
||||
http.Hijacker
|
||||
io.ReaderFrom
|
||||
}{rw, rw, rw, rw}
|
||||
// combination 8/32
|
||||
case !i0 && !i1 && i2 && i3 && i4:
|
||||
return struct {
|
||||
Unwrapper
|
||||
http.ResponseWriter
|
||||
http.Hijacker
|
||||
io.ReaderFrom
|
||||
http.Pusher
|
||||
}{rw, rw, rw, rw, rw}
|
||||
// combination 9/32
|
||||
case !i0 && i1 && !i2 && !i3 && !i4:
|
||||
return struct {
|
||||
Unwrapper
|
||||
http.ResponseWriter
|
||||
http.CloseNotifier
|
||||
}{rw, rw, rw}
|
||||
// combination 10/32
|
||||
case !i0 && i1 && !i2 && !i3 && i4:
|
||||
return struct {
|
||||
Unwrapper
|
||||
http.ResponseWriter
|
||||
http.CloseNotifier
|
||||
http.Pusher
|
||||
}{rw, rw, rw, rw}
|
||||
// combination 11/32
|
||||
case !i0 && i1 && !i2 && i3 && !i4:
|
||||
return struct {
|
||||
Unwrapper
|
||||
http.ResponseWriter
|
||||
http.CloseNotifier
|
||||
io.ReaderFrom
|
||||
}{rw, rw, rw, rw}
|
||||
// combination 12/32
|
||||
case !i0 && i1 && !i2 && i3 && i4:
|
||||
return struct {
|
||||
Unwrapper
|
||||
http.ResponseWriter
|
||||
http.CloseNotifier
|
||||
io.ReaderFrom
|
||||
http.Pusher
|
||||
}{rw, rw, rw, rw, rw}
|
||||
// combination 13/32
|
||||
case !i0 && i1 && i2 && !i3 && !i4:
|
||||
return struct {
|
||||
Unwrapper
|
||||
http.ResponseWriter
|
||||
http.CloseNotifier
|
||||
http.Hijacker
|
||||
}{rw, rw, rw, rw}
|
||||
// combination 14/32
|
||||
case !i0 && i1 && i2 && !i3 && i4:
|
||||
return struct {
|
||||
Unwrapper
|
||||
http.ResponseWriter
|
||||
http.CloseNotifier
|
||||
http.Hijacker
|
||||
http.Pusher
|
||||
}{rw, rw, rw, rw, rw}
|
||||
// combination 15/32
|
||||
case !i0 && i1 && i2 && i3 && !i4:
|
||||
return struct {
|
||||
Unwrapper
|
||||
http.ResponseWriter
|
||||
http.CloseNotifier
|
||||
http.Hijacker
|
||||
io.ReaderFrom
|
||||
}{rw, rw, rw, rw, rw}
|
||||
// combination 16/32
|
||||
case !i0 && i1 && i2 && i3 && i4:
|
||||
return struct {
|
||||
Unwrapper
|
||||
http.ResponseWriter
|
||||
http.CloseNotifier
|
||||
http.Hijacker
|
||||
io.ReaderFrom
|
||||
http.Pusher
|
||||
}{rw, rw, rw, rw, rw, rw}
|
||||
// combination 17/32
|
||||
case i0 && !i1 && !i2 && !i3 && !i4:
|
||||
return struct {
|
||||
Unwrapper
|
||||
http.ResponseWriter
|
||||
http.Flusher
|
||||
}{rw, rw, rw}
|
||||
// combination 18/32
|
||||
case i0 && !i1 && !i2 && !i3 && i4:
|
||||
return struct {
|
||||
Unwrapper
|
||||
http.ResponseWriter
|
||||
http.Flusher
|
||||
http.Pusher
|
||||
}{rw, rw, rw, rw}
|
||||
// combination 19/32
|
||||
case i0 && !i1 && !i2 && i3 && !i4:
|
||||
return struct {
|
||||
Unwrapper
|
||||
http.ResponseWriter
|
||||
http.Flusher
|
||||
io.ReaderFrom
|
||||
}{rw, rw, rw, rw}
|
||||
// combination 20/32
|
||||
case i0 && !i1 && !i2 && i3 && i4:
|
||||
return struct {
|
||||
Unwrapper
|
||||
http.ResponseWriter
|
||||
http.Flusher
|
||||
io.ReaderFrom
|
||||
http.Pusher
|
||||
}{rw, rw, rw, rw, rw}
|
||||
// combination 21/32
|
||||
case i0 && !i1 && i2 && !i3 && !i4:
|
||||
return struct {
|
||||
Unwrapper
|
||||
http.ResponseWriter
|
||||
http.Flusher
|
||||
http.Hijacker
|
||||
}{rw, rw, rw, rw}
|
||||
// combination 22/32
|
||||
case i0 && !i1 && i2 && !i3 && i4:
|
||||
return struct {
|
||||
Unwrapper
|
||||
http.ResponseWriter
|
||||
http.Flusher
|
||||
http.Hijacker
|
||||
http.Pusher
|
||||
}{rw, rw, rw, rw, rw}
|
||||
// combination 23/32
|
||||
case i0 && !i1 && i2 && i3 && !i4:
|
||||
return struct {
|
||||
Unwrapper
|
||||
http.ResponseWriter
|
||||
http.Flusher
|
||||
http.Hijacker
|
||||
io.ReaderFrom
|
||||
}{rw, rw, rw, rw, rw}
|
||||
// combination 24/32
|
||||
case i0 && !i1 && i2 && i3 && i4:
|
||||
return struct {
|
||||
Unwrapper
|
||||
http.ResponseWriter
|
||||
http.Flusher
|
||||
http.Hijacker
|
||||
io.ReaderFrom
|
||||
http.Pusher
|
||||
}{rw, rw, rw, rw, rw, rw}
|
||||
// combination 25/32
|
||||
case i0 && i1 && !i2 && !i3 && !i4:
|
||||
return struct {
|
||||
Unwrapper
|
||||
http.ResponseWriter
|
||||
http.Flusher
|
||||
http.CloseNotifier
|
||||
}{rw, rw, rw, rw}
|
||||
// combination 26/32
|
||||
case i0 && i1 && !i2 && !i3 && i4:
|
||||
return struct {
|
||||
Unwrapper
|
||||
http.ResponseWriter
|
||||
http.Flusher
|
||||
http.CloseNotifier
|
||||
http.Pusher
|
||||
}{rw, rw, rw, rw, rw}
|
||||
// combination 27/32
|
||||
case i0 && i1 && !i2 && i3 && !i4:
|
||||
return struct {
|
||||
Unwrapper
|
||||
http.ResponseWriter
|
||||
http.Flusher
|
||||
http.CloseNotifier
|
||||
io.ReaderFrom
|
||||
}{rw, rw, rw, rw, rw}
|
||||
// combination 28/32
|
||||
case i0 && i1 && !i2 && i3 && i4:
|
||||
return struct {
|
||||
Unwrapper
|
||||
http.ResponseWriter
|
||||
http.Flusher
|
||||
http.CloseNotifier
|
||||
io.ReaderFrom
|
||||
http.Pusher
|
||||
}{rw, rw, rw, rw, rw, rw}
|
||||
// combination 29/32
|
||||
case i0 && i1 && i2 && !i3 && !i4:
|
||||
return struct {
|
||||
Unwrapper
|
||||
http.ResponseWriter
|
||||
http.Flusher
|
||||
http.CloseNotifier
|
||||
http.Hijacker
|
||||
}{rw, rw, rw, rw, rw}
|
||||
// combination 30/32
|
||||
case i0 && i1 && i2 && !i3 && i4:
|
||||
return struct {
|
||||
Unwrapper
|
||||
http.ResponseWriter
|
||||
http.Flusher
|
||||
http.CloseNotifier
|
||||
http.Hijacker
|
||||
http.Pusher
|
||||
}{rw, rw, rw, rw, rw, rw}
|
||||
// combination 31/32
|
||||
case i0 && i1 && i2 && i3 && !i4:
|
||||
return struct {
|
||||
Unwrapper
|
||||
http.ResponseWriter
|
||||
http.Flusher
|
||||
http.CloseNotifier
|
||||
http.Hijacker
|
||||
io.ReaderFrom
|
||||
}{rw, rw, rw, rw, rw, rw}
|
||||
// combination 32/32
|
||||
case i0 && i1 && i2 && i3 && i4:
|
||||
return struct {
|
||||
Unwrapper
|
||||
http.ResponseWriter
|
||||
http.Flusher
|
||||
http.CloseNotifier
|
||||
http.Hijacker
|
||||
io.ReaderFrom
|
||||
http.Pusher
|
||||
}{rw, rw, rw, rw, rw, rw, rw}
|
||||
}
|
||||
panic("unreachable")
|
||||
}
|
||||
|
||||
type rw struct {
|
||||
w http.ResponseWriter
|
||||
h Hooks
|
||||
}
|
||||
|
||||
func (w *rw) Unwrap() http.ResponseWriter {
|
||||
return w.w
|
||||
}
|
||||
|
||||
func (w *rw) Header() http.Header {
|
||||
f := w.w.(http.ResponseWriter).Header
|
||||
if w.h.Header != nil {
|
||||
f = w.h.Header(f)
|
||||
}
|
||||
return f()
|
||||
}
|
||||
|
||||
func (w *rw) WriteHeader(code int) {
|
||||
f := w.w.(http.ResponseWriter).WriteHeader
|
||||
if w.h.WriteHeader != nil {
|
||||
f = w.h.WriteHeader(f)
|
||||
}
|
||||
f(code)
|
||||
}
|
||||
|
||||
func (w *rw) Write(b []byte) (int, error) {
|
||||
f := w.w.(http.ResponseWriter).Write
|
||||
if w.h.Write != nil {
|
||||
f = w.h.Write(f)
|
||||
}
|
||||
return f(b)
|
||||
}
|
||||
|
||||
func (w *rw) Flush() {
|
||||
f := w.w.(http.Flusher).Flush
|
||||
if w.h.Flush != nil {
|
||||
f = w.h.Flush(f)
|
||||
}
|
||||
f()
|
||||
}
|
||||
|
||||
func (w *rw) CloseNotify() <-chan bool {
|
||||
f := w.w.(http.CloseNotifier).CloseNotify
|
||||
if w.h.CloseNotify != nil {
|
||||
f = w.h.CloseNotify(f)
|
||||
}
|
||||
return f()
|
||||
}
|
||||
|
||||
func (w *rw) Hijack() (net.Conn, *bufio.ReadWriter, error) {
|
||||
f := w.w.(http.Hijacker).Hijack
|
||||
if w.h.Hijack != nil {
|
||||
f = w.h.Hijack(f)
|
||||
}
|
||||
return f()
|
||||
}
|
||||
|
||||
func (w *rw) ReadFrom(src io.Reader) (int64, error) {
|
||||
f := w.w.(io.ReaderFrom).ReadFrom
|
||||
if w.h.ReadFrom != nil {
|
||||
f = w.h.ReadFrom(f)
|
||||
}
|
||||
return f(src)
|
||||
}
|
||||
|
||||
func (w *rw) Push(target string, opts *http.PushOptions) error {
|
||||
f := w.w.(http.Pusher).Push
|
||||
if w.h.Push != nil {
|
||||
f = w.h.Push(f)
|
||||
}
|
||||
return f(target, opts)
|
||||
}
|
||||
|
||||
type Unwrapper interface {
|
||||
Unwrap() http.ResponseWriter
|
||||
}
|
||||
|
||||
// Unwrap returns the underlying http.ResponseWriter from within zero or more
|
||||
// layers of httpsnoop wrappers.
|
||||
func Unwrap(w http.ResponseWriter) http.ResponseWriter {
|
||||
if rw, ok := w.(Unwrapper); ok {
|
||||
// recurse until rw.Unwrap() returns a non-Unwrapper
|
||||
return Unwrap(rw.Unwrap())
|
||||
} else {
|
||||
return w
|
||||
}
|
||||
}
|
||||
278
vendor/github.com/felixge/httpsnoop/wrap_generated_lt_1.8.go
generated
vendored
Normal file
278
vendor/github.com/felixge/httpsnoop/wrap_generated_lt_1.8.go
generated
vendored
Normal file
|
|
@ -0,0 +1,278 @@
|
|||
// +build !go1.8
|
||||
// Code generated by "httpsnoop/codegen"; DO NOT EDIT
|
||||
|
||||
package httpsnoop
|
||||
|
||||
import (
|
||||
"bufio"
|
||||
"io"
|
||||
"net"
|
||||
"net/http"
|
||||
)
|
||||
|
||||
// HeaderFunc is part of the http.ResponseWriter interface.
|
||||
type HeaderFunc func() http.Header
|
||||
|
||||
// WriteHeaderFunc is part of the http.ResponseWriter interface.
|
||||
type WriteHeaderFunc func(code int)
|
||||
|
||||
// WriteFunc is part of the http.ResponseWriter interface.
|
||||
type WriteFunc func(b []byte) (int, error)
|
||||
|
||||
// FlushFunc is part of the http.Flusher interface.
|
||||
type FlushFunc func()
|
||||
|
||||
// CloseNotifyFunc is part of the http.CloseNotifier interface.
|
||||
type CloseNotifyFunc func() <-chan bool
|
||||
|
||||
// HijackFunc is part of the http.Hijacker interface.
|
||||
type HijackFunc func() (net.Conn, *bufio.ReadWriter, error)
|
||||
|
||||
// ReadFromFunc is part of the io.ReaderFrom interface.
|
||||
type ReadFromFunc func(src io.Reader) (int64, error)
|
||||
|
||||
// Hooks defines a set of method interceptors for methods included in
|
||||
// http.ResponseWriter as well as some others. You can think of them as
|
||||
// middleware for the function calls they target. See Wrap for more details.
|
||||
type Hooks struct {
|
||||
Header func(HeaderFunc) HeaderFunc
|
||||
WriteHeader func(WriteHeaderFunc) WriteHeaderFunc
|
||||
Write func(WriteFunc) WriteFunc
|
||||
Flush func(FlushFunc) FlushFunc
|
||||
CloseNotify func(CloseNotifyFunc) CloseNotifyFunc
|
||||
Hijack func(HijackFunc) HijackFunc
|
||||
ReadFrom func(ReadFromFunc) ReadFromFunc
|
||||
}
|
||||
|
||||
// Wrap returns a wrapped version of w that provides the exact same interface
|
||||
// as w. Specifically if w implements any combination of:
|
||||
//
|
||||
// - http.Flusher
|
||||
// - http.CloseNotifier
|
||||
// - http.Hijacker
|
||||
// - io.ReaderFrom
|
||||
//
|
||||
// The wrapped version will implement the exact same combination. If no hooks
|
||||
// are set, the wrapped version also behaves exactly as w. Hooks targeting
|
||||
// methods not supported by w are ignored. Any other hooks will intercept the
|
||||
// method they target and may modify the call's arguments and/or return values.
|
||||
// The CaptureMetrics implementation serves as a working example for how the
|
||||
// hooks can be used.
|
||||
func Wrap(w http.ResponseWriter, hooks Hooks) http.ResponseWriter {
|
||||
rw := &rw{w: w, h: hooks}
|
||||
_, i0 := w.(http.Flusher)
|
||||
_, i1 := w.(http.CloseNotifier)
|
||||
_, i2 := w.(http.Hijacker)
|
||||
_, i3 := w.(io.ReaderFrom)
|
||||
switch {
|
||||
// combination 1/16
|
||||
case !i0 && !i1 && !i2 && !i3:
|
||||
return struct {
|
||||
Unwrapper
|
||||
http.ResponseWriter
|
||||
}{rw, rw}
|
||||
// combination 2/16
|
||||
case !i0 && !i1 && !i2 && i3:
|
||||
return struct {
|
||||
Unwrapper
|
||||
http.ResponseWriter
|
||||
io.ReaderFrom
|
||||
}{rw, rw, rw}
|
||||
// combination 3/16
|
||||
case !i0 && !i1 && i2 && !i3:
|
||||
return struct {
|
||||
Unwrapper
|
||||
http.ResponseWriter
|
||||
http.Hijacker
|
||||
}{rw, rw, rw}
|
||||
// combination 4/16
|
||||
case !i0 && !i1 && i2 && i3:
|
||||
return struct {
|
||||
Unwrapper
|
||||
http.ResponseWriter
|
||||
http.Hijacker
|
||||
io.ReaderFrom
|
||||
}{rw, rw, rw, rw}
|
||||
// combination 5/16
|
||||
case !i0 && i1 && !i2 && !i3:
|
||||
return struct {
|
||||
Unwrapper
|
||||
http.ResponseWriter
|
||||
http.CloseNotifier
|
||||
}{rw, rw, rw}
|
||||
// combination 6/16
|
||||
case !i0 && i1 && !i2 && i3:
|
||||
return struct {
|
||||
Unwrapper
|
||||
http.ResponseWriter
|
||||
http.CloseNotifier
|
||||
io.ReaderFrom
|
||||
}{rw, rw, rw, rw}
|
||||
// combination 7/16
|
||||
case !i0 && i1 && i2 && !i3:
|
||||
return struct {
|
||||
Unwrapper
|
||||
http.ResponseWriter
|
||||
http.CloseNotifier
|
||||
http.Hijacker
|
||||
}{rw, rw, rw, rw}
|
||||
// combination 8/16
|
||||
case !i0 && i1 && i2 && i3:
|
||||
return struct {
|
||||
Unwrapper
|
||||
http.ResponseWriter
|
||||
http.CloseNotifier
|
||||
http.Hijacker
|
||||
io.ReaderFrom
|
||||
}{rw, rw, rw, rw, rw}
|
||||
// combination 9/16
|
||||
case i0 && !i1 && !i2 && !i3:
|
||||
return struct {
|
||||
Unwrapper
|
||||
http.ResponseWriter
|
||||
http.Flusher
|
||||
}{rw, rw, rw}
|
||||
// combination 10/16
|
||||
case i0 && !i1 && !i2 && i3:
|
||||
return struct {
|
||||
Unwrapper
|
||||
http.ResponseWriter
|
||||
http.Flusher
|
||||
io.ReaderFrom
|
||||
}{rw, rw, rw, rw}
|
||||
// combination 11/16
|
||||
case i0 && !i1 && i2 && !i3:
|
||||
return struct {
|
||||
Unwrapper
|
||||
http.ResponseWriter
|
||||
http.Flusher
|
||||
http.Hijacker
|
||||
}{rw, rw, rw, rw}
|
||||
// combination 12/16
|
||||
case i0 && !i1 && i2 && i3:
|
||||
return struct {
|
||||
Unwrapper
|
||||
http.ResponseWriter
|
||||
http.Flusher
|
||||
http.Hijacker
|
||||
io.ReaderFrom
|
||||
}{rw, rw, rw, rw, rw}
|
||||
// combination 13/16
|
||||
case i0 && i1 && !i2 && !i3:
|
||||
return struct {
|
||||
Unwrapper
|
||||
http.ResponseWriter
|
||||
http.Flusher
|
||||
http.CloseNotifier
|
||||
}{rw, rw, rw, rw}
|
||||
// combination 14/16
|
||||
case i0 && i1 && !i2 && i3:
|
||||
return struct {
|
||||
Unwrapper
|
||||
http.ResponseWriter
|
||||
http.Flusher
|
||||
http.CloseNotifier
|
||||
io.ReaderFrom
|
||||
}{rw, rw, rw, rw, rw}
|
||||
// combination 15/16
|
||||
case i0 && i1 && i2 && !i3:
|
||||
return struct {
|
||||
Unwrapper
|
||||
http.ResponseWriter
|
||||
http.Flusher
|
||||
http.CloseNotifier
|
||||
http.Hijacker
|
||||
}{rw, rw, rw, rw, rw}
|
||||
// combination 16/16
|
||||
case i0 && i1 && i2 && i3:
|
||||
return struct {
|
||||
Unwrapper
|
||||
http.ResponseWriter
|
||||
http.Flusher
|
||||
http.CloseNotifier
|
||||
http.Hijacker
|
||||
io.ReaderFrom
|
||||
}{rw, rw, rw, rw, rw, rw}
|
||||
}
|
||||
panic("unreachable")
|
||||
}
|
||||
|
||||
type rw struct {
|
||||
w http.ResponseWriter
|
||||
h Hooks
|
||||
}
|
||||
|
||||
func (w *rw) Unwrap() http.ResponseWriter {
|
||||
return w.w
|
||||
}
|
||||
|
||||
func (w *rw) Header() http.Header {
|
||||
f := w.w.(http.ResponseWriter).Header
|
||||
if w.h.Header != nil {
|
||||
f = w.h.Header(f)
|
||||
}
|
||||
return f()
|
||||
}
|
||||
|
||||
func (w *rw) WriteHeader(code int) {
|
||||
f := w.w.(http.ResponseWriter).WriteHeader
|
||||
if w.h.WriteHeader != nil {
|
||||
f = w.h.WriteHeader(f)
|
||||
}
|
||||
f(code)
|
||||
}
|
||||
|
||||
func (w *rw) Write(b []byte) (int, error) {
|
||||
f := w.w.(http.ResponseWriter).Write
|
||||
if w.h.Write != nil {
|
||||
f = w.h.Write(f)
|
||||
}
|
||||
return f(b)
|
||||
}
|
||||
|
||||
func (w *rw) Flush() {
|
||||
f := w.w.(http.Flusher).Flush
|
||||
if w.h.Flush != nil {
|
||||
f = w.h.Flush(f)
|
||||
}
|
||||
f()
|
||||
}
|
||||
|
||||
func (w *rw) CloseNotify() <-chan bool {
|
||||
f := w.w.(http.CloseNotifier).CloseNotify
|
||||
if w.h.CloseNotify != nil {
|
||||
f = w.h.CloseNotify(f)
|
||||
}
|
||||
return f()
|
||||
}
|
||||
|
||||
func (w *rw) Hijack() (net.Conn, *bufio.ReadWriter, error) {
|
||||
f := w.w.(http.Hijacker).Hijack
|
||||
if w.h.Hijack != nil {
|
||||
f = w.h.Hijack(f)
|
||||
}
|
||||
return f()
|
||||
}
|
||||
|
||||
func (w *rw) ReadFrom(src io.Reader) (int64, error) {
|
||||
f := w.w.(io.ReaderFrom).ReadFrom
|
||||
if w.h.ReadFrom != nil {
|
||||
f = w.h.ReadFrom(f)
|
||||
}
|
||||
return f(src)
|
||||
}
|
||||
|
||||
type Unwrapper interface {
|
||||
Unwrap() http.ResponseWriter
|
||||
}
|
||||
|
||||
// Unwrap returns the underlying http.ResponseWriter from within zero or more
|
||||
// layers of httpsnoop wrappers.
|
||||
func Unwrap(w http.ResponseWriter) http.ResponseWriter {
|
||||
if rw, ok := w.(Unwrapper); ok {
|
||||
// recurse until rw.Unwrap() returns a non-Unwrapper
|
||||
return Unwrap(rw.Unwrap())
|
||||
} else {
|
||||
return w
|
||||
}
|
||||
}
|
||||
5 vendor/github.com/go-openapi/analysis/.codecov.yml generated vendored Normal file

@@ -0,0 +1,5 @@
coverage:
  status:
    patch:
      default:
        target: 80%

2 vendor/github.com/go-openapi/analysis/.gitattributes generated vendored Normal file

@@ -0,0 +1,2 @@
*.go text eol=lf

5 vendor/github.com/go-openapi/analysis/.gitignore generated vendored Normal file

@@ -0,0 +1,5 @@
secrets.yml
coverage.out
coverage.txt
*.cov
.idea

56
vendor/github.com/go-openapi/analysis/.golangci.yml
generated
vendored
Normal file
56
vendor/github.com/go-openapi/analysis/.golangci.yml
generated
vendored
Normal file
|
|
@ -0,0 +1,56 @@
|
|||
linters-settings:
|
||||
govet:
|
||||
check-shadowing: true
|
||||
golint:
|
||||
min-confidence: 0
|
||||
gocyclo:
|
||||
min-complexity: 40
|
||||
gocognit:
|
||||
min-complexity: 40
|
||||
maligned:
|
||||
suggest-new: true
|
||||
dupl:
|
||||
threshold: 150
|
||||
goconst:
|
||||
min-len: 2
|
||||
min-occurrences: 4
|
||||
|
||||
linters:
|
||||
enable-all: true
|
||||
disable:
|
||||
- maligned
|
||||
- lll
|
||||
- gochecknoglobals
|
||||
- gochecknoinits
|
||||
# scopelint is useful, but also reports false positives
|
||||
# that unfortunately can't be disabled. So we disable the
|
||||
# linter rather than changing code that works.
|
||||
# see: https://github.com/kyoh86/scopelint/issues/4
|
||||
- scopelint
|
||||
- godox
|
||||
- gocognit
|
||||
#- whitespace
|
||||
- wsl
|
||||
- funlen
|
||||
- testpackage
|
||||
- wrapcheck
|
||||
#- nlreturn
|
||||
- gomnd
|
||||
- goerr113
|
||||
- exhaustivestruct
|
||||
#- errorlint
|
||||
#- nestif
|
||||
- gofumpt
|
||||
- godot
|
||||
- gci
|
||||
- dogsled
|
||||
- paralleltest
|
||||
- tparallel
|
||||
- thelper
|
||||
- ifshort
|
||||
- forbidigo
|
||||
- cyclop
|
||||
- varnamelen
|
||||
- exhaustruct
|
||||
- nonamedreturns
|
||||
- nosnakecase
|
||||
74
vendor/github.com/go-openapi/analysis/CODE_OF_CONDUCT.md
generated
vendored
Normal file
74
vendor/github.com/go-openapi/analysis/CODE_OF_CONDUCT.md
generated
vendored
Normal file
|
|
@ -0,0 +1,74 @@
|
|||
# Contributor Covenant Code of Conduct
|
||||
|
||||
## Our Pledge
|
||||
|
||||
In the interest of fostering an open and welcoming environment, we as
|
||||
contributors and maintainers pledge to making participation in our project and
|
||||
our community a harassment-free experience for everyone, regardless of age, body
|
||||
size, disability, ethnicity, gender identity and expression, level of experience,
|
||||
nationality, personal appearance, race, religion, or sexual identity and
|
||||
orientation.
|
||||
|
||||
## Our Standards
|
||||
|
||||
Examples of behavior that contributes to creating a positive environment
|
||||
include:
|
||||
|
||||
* Using welcoming and inclusive language
|
||||
* Being respectful of differing viewpoints and experiences
|
||||
* Gracefully accepting constructive criticism
|
||||
* Focusing on what is best for the community
|
||||
* Showing empathy towards other community members
|
||||
|
||||
Examples of unacceptable behavior by participants include:
|
||||
|
||||
* The use of sexualized language or imagery and unwelcome sexual attention or
|
||||
advances
|
||||
* Trolling, insulting/derogatory comments, and personal or political attacks
|
||||
* Public or private harassment
|
||||
* Publishing others' private information, such as a physical or electronic
|
||||
address, without explicit permission
|
||||
* Other conduct which could reasonably be considered inappropriate in a
|
||||
professional setting
|
||||
|
||||
## Our Responsibilities
|
||||
|
||||
Project maintainers are responsible for clarifying the standards of acceptable
|
||||
behavior and are expected to take appropriate and fair corrective action in
|
||||
response to any instances of unacceptable behavior.
|
||||
|
||||
Project maintainers have the right and responsibility to remove, edit, or
|
||||
reject comments, commits, code, wiki edits, issues, and other contributions
|
||||
that are not aligned to this Code of Conduct, or to ban temporarily or
|
||||
permanently any contributor for other behaviors that they deem inappropriate,
|
||||
threatening, offensive, or harmful.
|
||||
|
||||
## Scope
|
||||
|
||||
This Code of Conduct applies both within project spaces and in public spaces
|
||||
when an individual is representing the project or its community. Examples of
|
||||
representing a project or community include using an official project e-mail
|
||||
address, posting via an official social media account, or acting as an appointed
|
||||
representative at an online or offline event. Representation of a project may be
|
||||
further defined and clarified by project maintainers.
|
||||
|
||||
## Enforcement
|
||||
|
||||
Instances of abusive, harassing, or otherwise unacceptable behavior may be
|
||||
reported by contacting the project team at ivan+abuse@flanders.co.nz. All
|
||||
complaints will be reviewed and investigated and will result in a response that
|
||||
is deemed necessary and appropriate to the circumstances. The project team is
|
||||
obligated to maintain confidentiality with regard to the reporter of an incident.
|
||||
Further details of specific enforcement policies may be posted separately.
|
||||
|
||||
Project maintainers who do not follow or enforce the Code of Conduct in good
|
||||
faith may face temporary or permanent repercussions as determined by other
|
||||
members of the project's leadership.
|
||||
|
||||
## Attribution
|
||||
|
||||
This Code of Conduct is adapted from the [Contributor Covenant][homepage], version 1.4,
|
||||
available at [http://contributor-covenant.org/version/1/4][version]
|
||||
|
||||
[homepage]: http://contributor-covenant.org
|
||||
[version]: http://contributor-covenant.org/version/1/4/
|
||||
202
vendor/github.com/go-openapi/analysis/LICENSE
generated
vendored
Normal file
202
vendor/github.com/go-openapi/analysis/LICENSE
generated
vendored
Normal file
|
|
@ -0,0 +1,202 @@
|
|||
|
||||
Apache License
|
||||
Version 2.0, January 2004
|
||||
http://www.apache.org/licenses/
|
||||
|
||||
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
|
||||
|
||||
1. Definitions.
|
||||
|
||||
"License" shall mean the terms and conditions for use, reproduction,
|
||||
and distribution as defined by Sections 1 through 9 of this document.
|
||||
|
||||
"Licensor" shall mean the copyright owner or entity authorized by
|
||||
the copyright owner that is granting the License.
|
||||
|
||||
"Legal Entity" shall mean the union of the acting entity and all
|
||||
other entities that control, are controlled by, or are under common
|
||||
control with that entity. For the purposes of this definition,
|
||||
"control" means (i) the power, direct or indirect, to cause the
|
||||
direction or management of such entity, whether by contract or
|
||||
otherwise, or (ii) ownership of fifty percent (50%) or more of the
|
||||
outstanding shares, or (iii) beneficial ownership of such entity.
|
||||
|
||||
"You" (or "Your") shall mean an individual or Legal Entity
|
||||
exercising permissions granted by this License.
|
||||
|
||||
"Source" form shall mean the preferred form for making modifications,
|
||||
including but not limited to software source code, documentation
|
||||
source, and configuration files.
|
||||
|
||||
"Object" form shall mean any form resulting from mechanical
|
||||
transformation or translation of a Source form, including but
|
||||
not limited to compiled object code, generated documentation,
|
||||
and conversions to other media types.
|
||||
|
||||
"Work" shall mean the work of authorship, whether in Source or
|
||||
Object form, made available under the License, as indicated by a
|
||||
copyright notice that is included in or attached to the work
|
||||
(an example is provided in the Appendix below).
|
||||
|
||||
"Derivative Works" shall mean any work, whether in Source or Object
|
||||
form, that is based on (or derived from) the Work and for which the
|
||||
editorial revisions, annotations, elaborations, or other modifications
|
||||
represent, as a whole, an original work of authorship. For the purposes
|
||||
of this License, Derivative Works shall not include works that remain
|
||||
separable from, or merely link (or bind by name) to the interfaces of,
|
||||
the Work and Derivative Works thereof.
|
||||
|
||||
"Contribution" shall mean any work of authorship, including
|
||||
the original version of the Work and any modifications or additions
|
||||
to that Work or Derivative Works thereof, that is intentionally
|
||||
submitted to Licensor for inclusion in the Work by the copyright owner
|
||||
or by an individual or Legal Entity authorized to submit on behalf of
|
||||
the copyright owner. For the purposes of this definition, "submitted"
|
||||
means any form of electronic, verbal, or written communication sent
|
||||
to the Licensor or its representatives, including but not limited to
|
||||
communication on electronic mailing lists, source code control systems,
|
||||
and issue tracking systems that are managed by, or on behalf of, the
|
||||
Licensor for the purpose of discussing and improving the Work, but
|
||||
excluding communication that is conspicuously marked or otherwise
|
||||
designated in writing by the copyright owner as "Not a Contribution."
|
||||
|
||||
"Contributor" shall mean Licensor and any individual or Legal Entity
|
||||
on behalf of whom a Contribution has been received by Licensor and
|
||||
subsequently incorporated within the Work.
|
||||
|
||||
2. Grant of Copyright License. Subject to the terms and conditions of
|
||||
this License, each Contributor hereby grants to You a perpetual,
|
||||
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
|
||||
copyright license to reproduce, prepare Derivative Works of,
|
||||
publicly display, publicly perform, sublicense, and distribute the
|
||||
Work and such Derivative Works in Source or Object form.
|
||||
|
||||
3. Grant of Patent License. Subject to the terms and conditions of
|
||||
this License, each Contributor hereby grants to You a perpetual,
|
||||
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
|
||||
(except as stated in this section) patent license to make, have made,
|
||||
use, offer to sell, sell, import, and otherwise transfer the Work,
|
||||
where such license applies only to those patent claims licensable
|
||||
by such Contributor that are necessarily infringed by their
|
||||
Contribution(s) alone or by combination of their Contribution(s)
|
||||
with the Work to which such Contribution(s) was submitted. If You
|
||||
institute patent litigation against any entity (including a
|
||||
cross-claim or counterclaim in a lawsuit) alleging that the Work
|
||||
or a Contribution incorporated within the Work constitutes direct
|
||||
or contributory patent infringement, then any patent licenses
|
||||
granted to You under this License for that Work shall terminate
|
||||
as of the date such litigation is filed.
|
||||
|
||||
4. Redistribution. You may reproduce and distribute copies of the
|
||||
Work or Derivative Works thereof in any medium, with or without
|
||||
modifications, and in Source or Object form, provided that You
|
||||
meet the following conditions:
|
||||
|
||||
(a) You must give any other recipients of the Work or
|
||||
Derivative Works a copy of this License; and
|
||||
|
||||
(b) You must cause any modified files to carry prominent notices
|
||||
stating that You changed the files; and
|
||||
|
||||
(c) You must retain, in the Source form of any Derivative Works
|
||||
that You distribute, all copyright, patent, trademark, and
|
||||
attribution notices from the Source form of the Work,
|
||||
excluding those notices that do not pertain to any part of
|
||||
the Derivative Works; and
|
||||
|
||||
(d) If the Work includes a "NOTICE" text file as part of its
|
||||
distribution, then any Derivative Works that You distribute must
|
||||
include a readable copy of the attribution notices contained
|
||||
within such NOTICE file, excluding those notices that do not
|
||||
pertain to any part of the Derivative Works, in at least one
|
||||
of the following places: within a NOTICE text file distributed
|
||||
as part of the Derivative Works; within the Source form or
|
||||
documentation, if provided along with the Derivative Works; or,
|
||||
within a display generated by the Derivative Works, if and
|
||||
wherever such third-party notices normally appear. The contents
|
||||
of the NOTICE file are for informational purposes only and
|
||||
do not modify the License. You may add Your own attribution
|
||||
notices within Derivative Works that You distribute, alongside
|
||||
or as an addendum to the NOTICE text from the Work, provided
|
||||
that such additional attribution notices cannot be construed
|
||||
as modifying the License.
|
||||
|
||||
You may add Your own copyright statement to Your modifications and
|
||||
may provide additional or different license terms and conditions
|
||||
for use, reproduction, or distribution of Your modifications, or
|
||||
for any such Derivative Works as a whole, provided Your use,
|
||||
reproduction, and distribution of the Work otherwise complies with
|
||||
the conditions stated in this License.
|
||||
|
||||
5. Submission of Contributions. Unless You explicitly state otherwise,
|
||||
any Contribution intentionally submitted for inclusion in the Work
|
||||
by You to the Licensor shall be under the terms and conditions of
|
||||
this License, without any additional terms or conditions.
|
||||
Notwithstanding the above, nothing herein shall supersede or modify
|
||||
the terms of any separate license agreement you may have executed
|
||||
with Licensor regarding such Contributions.
|
||||
|
||||
6. Trademarks. This License does not grant permission to use the trade
|
||||
names, trademarks, service marks, or product names of the Licensor,
|
||||
except as required for reasonable and customary use in describing the
|
||||
origin of the Work and reproducing the content of the NOTICE file.
|
||||
|
||||
7. Disclaimer of Warranty. Unless required by applicable law or
|
||||
agreed to in writing, Licensor provides the Work (and each
|
||||
Contributor provides its Contributions) on an "AS IS" BASIS,
|
||||
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
|
||||
implied, including, without limitation, any warranties or conditions
|
||||
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
|
||||
PARTICULAR PURPOSE. You are solely responsible for determining the
|
||||
appropriateness of using or redistributing the Work and assume any
|
||||
risks associated with Your exercise of permissions under this License.
|
||||
|
||||
8. Limitation of Liability. In no event and under no legal theory,
|
||||
whether in tort (including negligence), contract, or otherwise,
|
||||
unless required by applicable law (such as deliberate and grossly
|
||||
negligent acts) or agreed to in writing, shall any Contributor be
|
||||
liable to You for damages, including any direct, indirect, special,
|
||||
incidental, or consequential damages of any character arising as a
|
||||
result of this License or out of the use or inability to use the
|
||||
Work (including but not limited to damages for loss of goodwill,
|
||||
work stoppage, computer failure or malfunction, or any and all
|
||||
other commercial damages or losses), even if such Contributor
|
||||
has been advised of the possibility of such damages.
|
||||
|
||||
9. Accepting Warranty or Additional Liability. While redistributing
|
||||
the Work or Derivative Works thereof, You may choose to offer,
|
||||
and charge a fee for, acceptance of support, warranty, indemnity,
|
||||
or other liability obligations and/or rights consistent with this
|
||||
License. However, in accepting such obligations, You may act only
|
||||
on Your own behalf and on Your sole responsibility, not on behalf
|
||||
of any other Contributor, and only if You agree to indemnify,
|
||||
defend, and hold each Contributor harmless for any liability
|
||||
incurred by, or claims asserted against, such Contributor by reason
|
||||
of your accepting any such warranty or additional liability.
|
||||
|
||||
END OF TERMS AND CONDITIONS
|
||||
|
||||
APPENDIX: How to apply the Apache License to your work.
|
||||
|
||||
To apply the Apache License to your work, attach the following
|
||||
boilerplate notice, with the fields enclosed by brackets "[]"
|
||||
replaced with your own identifying information. (Don't include
|
||||
the brackets!) The text should be enclosed in the appropriate
|
||||
comment syntax for the file format. We also recommend that a
|
||||
file or class name and description of purpose be included on the
|
||||
same "printed page" as the copyright notice for easier
|
||||
identification within third-party archives.
|
||||
|
||||
Copyright [yyyy] [name of copyright owner]
|
||||
|
||||
Licensed under the Apache License, Version 2.0 (the "License");
|
||||
you may not use this file except in compliance with the License.
|
||||
You may obtain a copy of the License at
|
||||
|
||||
http://www.apache.org/licenses/LICENSE-2.0
|
||||
|
||||
Unless required by applicable law or agreed to in writing, software
|
||||
distributed under the License is distributed on an "AS IS" BASIS,
|
||||
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
See the License for the specific language governing permissions and
|
||||
limitations under the License.
|
||||
31 vendor/github.com/go-openapi/analysis/README.md generated vendored Normal file

@@ -0,0 +1,31 @@
# OpenAPI initiative analysis

[](https://travis-ci.org/go-openapi/analysis)
[](https://ci.appveyor.com/project/casualjim/go-openapi/analysis/branch/master)
[](https://codecov.io/gh/go-openapi/analysis)
[](https://slackin.goswagger.io)
[](https://raw.githubusercontent.com/go-openapi/analysis/master/LICENSE)
[](https://pkg.go.dev/github.com/go-openapi/analysis)
[](https://goreportcard.com/report/github.com/go-openapi/analysis)

A foundational library to analyze an OAI specification document for easier reasoning about the content.

## What's inside?

* An analyzer providing methods to walk the functional content of a specification
* A spec flattener producing a self-contained document bundle, while preserving `$ref`s
* A spec merger ("mixin") to merge several spec documents into a primary spec
* A spec "fixer" ensuring that response descriptions are non-empty

[Documentation](https://godoc.org/github.com/go-openapi/analysis)

## FAQ

* Does this library support OpenAPI 3?

> No.
> This package currently only supports OpenAPI 2.0 (aka Swagger 2.0).
> There is no plan to make it evolve toward supporting OpenAPI 3.x.
> This [discussion thread](https://github.com/go-openapi/spec/issues/21) relates the full story.
>

1064 vendor/github.com/go-openapi/analysis/analyzer.go generated vendored Normal file

File diff suppressed because it is too large
32
vendor/github.com/go-openapi/analysis/appveyor.yml
generated
vendored
Normal file
32
vendor/github.com/go-openapi/analysis/appveyor.yml
generated
vendored
Normal file
|
|
@ -0,0 +1,32 @@
|
|||
version: "0.1.{build}"
|
||||
|
||||
clone_folder: C:\go-openapi\analysis
|
||||
shallow_clone: true # for startup speed
|
||||
pull_requests:
|
||||
do_not_increment_build_number: true
|
||||
|
||||
#skip_tags: true
|
||||
#skip_branch_with_pr: true
|
||||
|
||||
# appveyor.yml
|
||||
build: off
|
||||
|
||||
environment:
|
||||
GOPATH: c:\gopath
|
||||
|
||||
stack: go 1.16
|
||||
|
||||
test_script:
|
||||
- go test -v -timeout 20m ./...
|
||||
|
||||
deploy: off
|
||||
|
||||
notifications:
|
||||
- provider: Slack
|
||||
incoming_webhook: https://hooks.slack.com/services/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
|
||||
auth_token:
|
||||
secure: Sf7kZf7ZGbnwWUMpffHwMu5A0cHkLK2MYY32LNTPj4+/3qC3Ghl7+9v4TSLOqOlCwdRNjOGblAq7s+GDJed6/xgRQl1JtCi1klzZNrYX4q01pgTPvvGcwbBkIYgeMaPeIRcK9OZnud7sRXdttozgTOpytps2U6Js32ip7uj5mHSg2ub0FwoSJwlS6dbezZ8+eDhoha0F/guY99BEwx8Bd+zROrT2TFGsSGOFGN6wFc7moCqTHO/YkWib13a2QNXqOxCCVBy/lt76Wp+JkeFppjHlzs/2lP3EAk13RIUAaesdEUHvIHrzCyNJEd3/+KO2DzsWOYfpktd+KBCvgaYOsoo7ubdT3IROeAegZdCgo/6xgCEsmFc9ZcqCfN5yNx2A+BZ2Vwmpws+bQ1E1+B5HDzzaiLcYfG4X2O210QVGVDLWsv1jqD+uPYeHY2WRfh5ZsIUFvaqgUEnwHwrK44/8REAhQavt1QAj5uJpsRd7CkRVPWRNK+yIky+wgbVUFEchRNmS55E7QWf+W4+4QZkQi7vUTMc9nbTUu2Es9NfvfudOpM2wZbn98fjpb/qq/nRv6Bk+ca+7XD5/IgNLMbWp2ouDdzbiHLCOfDUiHiDJhLfFZx9Bwo7ZwfzeOlbrQX66bx7xRKYmOe4DLrXhNcpbsMa8qbfxlZRCmYbubB/Y8h4=
|
||||
channel: bots
|
||||
on_build_success: false
|
||||
on_build_failure: true
|
||||
on_build_status_changed: true
|
||||
23
vendor/github.com/go-openapi/analysis/debug.go
generated
vendored
Normal file
23
vendor/github.com/go-openapi/analysis/debug.go
generated
vendored
Normal file
|
|
@ -0,0 +1,23 @@
|
|||
// Copyright 2015 go-swagger maintainers
|
||||
//
|
||||
// Licensed under the Apache License, Version 2.0 (the "License");
|
||||
// you may not use this file except in compliance with the License.
|
||||
// You may obtain a copy of the License at
|
||||
//
|
||||
// http://www.apache.org/licenses/LICENSE-2.0
|
||||
//
|
||||
// Unless required by applicable law or agreed to in writing, software
|
||||
// distributed under the License is distributed on an "AS IS" BASIS,
|
||||
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
// See the License for the specific language governing permissions and
|
||||
// limitations under the License.
|
||||
|
||||
package analysis
|
||||
|
||||
import (
|
||||
"os"
|
||||
|
||||
"github.com/go-openapi/analysis/internal/debug"
|
||||
)
|
||||
|
||||
var debugLog = debug.GetLogger("analysis", os.Getenv("SWAGGER_DEBUG") != "")
|
||||
43
vendor/github.com/go-openapi/analysis/doc.go
generated
vendored
Normal file
43
vendor/github.com/go-openapi/analysis/doc.go
generated
vendored
Normal file
|
|
@ -0,0 +1,43 @@
|
|||
// Copyright 2015 go-swagger maintainers
|
||||
//
|
||||
// Licensed under the Apache License, Version 2.0 (the "License");
|
||||
// you may not use this file except in compliance with the License.
|
||||
// You may obtain a copy of the License at
|
||||
//
|
||||
// http://www.apache.org/licenses/LICENSE-2.0
|
||||
//
|
||||
// Unless required by applicable law or agreed to in writing, software
|
||||
// distributed under the License is distributed on an "AS IS" BASIS,
|
||||
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
// See the License for the specific language governing permissions and
|
||||
// limitations under the License.
|
||||
|
||||
/*
|
||||
Package analysis provides methods to work with a Swagger specification document from
|
||||
package go-openapi/spec.
|
||||
|
||||
Analyzing a specification
|
||||
|
||||
An analysed specification object (type Spec) provides methods to work with swagger definition.
|
||||
|
||||
Flattening or expanding a specification
|
||||
|
||||
Flattening a specification bundles all remote $ref in the main spec document.
|
||||
Depending on flattening options, additional preprocessing may take place:
|
||||
- full flattening: replacing all inline complex constructs by a named entry in #/definitions
|
||||
- expand: replace all $ref's in the document by their expanded content
|
||||
|
||||
Merging several specifications
|
||||
|
||||
Mixin several specifications merges all Swagger constructs, and warns about found conflicts.
|
||||
|
||||
Fixing a specification
|
||||
|
||||
Unmarshalling a specification with golang json unmarshalling may lead to
|
||||
some unwanted result on present but empty fields.
|
||||
|
||||
Analyzing a Swagger schema
|
||||
|
||||
Swagger schemas are analyzed to determine their complexity and qualify their content.
|
||||
*/
|
||||
package analysis
|
||||
79
vendor/github.com/go-openapi/analysis/fixer.go
generated
vendored
Normal file
79
vendor/github.com/go-openapi/analysis/fixer.go
generated
vendored
Normal file
|
|
@ -0,0 +1,79 @@
|
|||
// Copyright 2015 go-swagger maintainers
|
||||
//
|
||||
// Licensed under the Apache License, Version 2.0 (the "License");
|
||||
// you may not use this file except in compliance with the License.
|
||||
// You may obtain a copy of the License at
|
||||
//
|
||||
// http://www.apache.org/licenses/LICENSE-2.0
|
||||
//
|
||||
// Unless required by applicable law or agreed to in writing, software
|
||||
// distributed under the License is distributed on an "AS IS" BASIS,
|
||||
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
// See the License for the specific language governing permissions and
|
||||
// limitations under the License.
|
||||
|
||||
package analysis
|
||||
|
||||
import "github.com/go-openapi/spec"
|
||||
|
||||
// FixEmptyResponseDescriptions replaces empty ("") response
|
||||
// descriptions in the input with "(empty)" to ensure that the
|
||||
// resulting Swagger is stays valid. The problem appears to arise
|
||||
// from reading in valid specs that have a explicit response
|
||||
// description of "" (valid, response.description is required), but
|
||||
// due to zero values being omitted upon re-serializing (omitempty) we
|
||||
// lose them unless we stick some chars in there.
|
||||
func FixEmptyResponseDescriptions(s *spec.Swagger) {
|
||||
for k, v := range s.Responses {
|
||||
FixEmptyDesc(&v) //#nosec
|
||||
s.Responses[k] = v
|
||||
}
|
||||
|
||||
if s.Paths == nil {
|
||||
return
|
||||
}
|
||||
|
||||
for _, v := range s.Paths.Paths {
|
||||
if v.Get != nil {
|
||||
FixEmptyDescs(v.Get.Responses)
|
||||
}
|
||||
if v.Put != nil {
|
||||
FixEmptyDescs(v.Put.Responses)
|
||||
}
|
||||
if v.Post != nil {
|
||||
FixEmptyDescs(v.Post.Responses)
|
||||
}
|
||||
if v.Delete != nil {
|
||||
FixEmptyDescs(v.Delete.Responses)
|
||||
}
|
||||
if v.Options != nil {
|
||||
FixEmptyDescs(v.Options.Responses)
|
||||
}
|
||||
if v.Head != nil {
|
||||
FixEmptyDescs(v.Head.Responses)
|
||||
}
|
||||
if v.Patch != nil {
|
||||
FixEmptyDescs(v.Patch.Responses)
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// FixEmptyDescs adds "(empty)" as the description for any Response in
|
||||
// the given Responses object that doesn't already have one.
|
||||
func FixEmptyDescs(rs *spec.Responses) {
|
||||
FixEmptyDesc(rs.Default)
|
||||
for k, v := range rs.StatusCodeResponses {
|
||||
FixEmptyDesc(&v) //#nosec
|
||||
rs.StatusCodeResponses[k] = v
|
||||
}
|
||||
}
|
||||
|
||||
// FixEmptyDesc adds "(empty)" as the description to the given
|
||||
// Response object if it doesn't already have one and isn't a
|
||||
// ref. No-op on nil input.
|
||||
func FixEmptyDesc(rs *spec.Response) {
|
||||
if rs == nil || rs.Description != "" || rs.Ref.Ref.GetURL() != nil {
|
||||
return
|
||||
}
|
||||
rs.Description = "(empty)"
|
||||
}
|
||||
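// Illustrative use of the fixer (the raw JSON bytes and variable names are
// placeholders; any *spec.Swagger value works):
//
//	var sw spec.Swagger
//	if err := json.Unmarshal(raw, &sw); err != nil {
//		return err
//	}
//	FixEmptyResponseDescriptions(&sw)
//	// every response now carries at least the "(empty)" placeholder description,
//	// so it survives re-serialization despite omitempty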
802
vendor/github.com/go-openapi/analysis/flatten.go
generated
vendored
Normal file
802
vendor/github.com/go-openapi/analysis/flatten.go
generated
vendored
Normal file
|
|
@@ -0,0 +1,802 @@
|
|||
// Copyright 2015 go-swagger maintainers
|
||||
//
|
||||
// Licensed under the Apache License, Version 2.0 (the "License");
|
||||
// you may not use this file except in compliance with the License.
|
||||
// You may obtain a copy of the License at
|
||||
//
|
||||
// http://www.apache.org/licenses/LICENSE-2.0
|
||||
//
|
||||
// Unless required by applicable law or agreed to in writing, software
|
||||
// distributed under the License is distributed on an "AS IS" BASIS,
|
||||
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
// See the License for the specific language governing permissions and
|
||||
// limitations under the License.
|
||||
|
||||
package analysis
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
"log"
|
||||
"path"
|
||||
"sort"
|
||||
"strings"
|
||||
|
||||
"github.com/go-openapi/analysis/internal/flatten/normalize"
|
||||
"github.com/go-openapi/analysis/internal/flatten/operations"
|
||||
"github.com/go-openapi/analysis/internal/flatten/replace"
|
||||
"github.com/go-openapi/analysis/internal/flatten/schutils"
|
||||
"github.com/go-openapi/analysis/internal/flatten/sortref"
|
||||
"github.com/go-openapi/jsonpointer"
|
||||
"github.com/go-openapi/spec"
|
||||
)
|
||||
|
||||
const definitionsPath = "#/definitions"
|
||||
|
||||
// newRef stores information about refs created during the flattening process
|
||||
type newRef struct {
|
||||
key string
|
||||
newName string
|
||||
path string
|
||||
isOAIGen bool
|
||||
resolved bool
|
||||
schema *spec.Schema
|
||||
parents []string
|
||||
}
|
||||
|
||||
// context stores intermediary results from flatten
|
||||
type context struct {
|
||||
newRefs map[string]*newRef
|
||||
warnings []string
|
||||
resolved map[string]string
|
||||
}
|
||||
|
||||
func newContext() *context {
|
||||
return &context{
|
||||
newRefs: make(map[string]*newRef, 150),
|
||||
warnings: make([]string, 0),
|
||||
resolved: make(map[string]string, 50),
|
||||
}
|
||||
}
|
||||
|
||||
// Flatten an analyzed spec and produce a self-contained spec bundle.
|
||||
//
|
||||
// There is a minimal and a full flattening mode.
|
||||
//
|
||||
//
|
||||
// Minimally flattening a spec means:
|
||||
// - Expanding parameters, responses, path items, parameter items and header items (references to schemas are left
|
||||
// unscathed)
|
||||
// - Importing external (http, file) references so they become internal to the document
|
||||
// - Moving every JSON pointer to a $ref to a named definition (i.e. the reworked spec does not contain pointers
|
||||
// like "$ref": "#/definitions/myObject/allOfs/1")
|
||||
//
|
||||
// A minimally flattened spec thus guarantees the following properties:
|
||||
// - all $refs point to a local definition (i.e. '#/definitions/...')
|
||||
// - definitions are unique
|
||||
//
|
||||
// NOTE: arbitrary JSON pointers (other than $refs to top level definitions) are rewritten as definitions if they
|
||||
// represent a complex schema or express commonality in the spec.
|
||||
// Otherwise, they are simply expanded.
|
||||
// Self-referencing JSON pointers cannot resolve to a type and trigger an error.
|
||||
//
|
||||
//
|
||||
// Minimal flattening is necessary and sufficient for codegen rendering using go-swagger.
|
||||
//
|
||||
// Fully flattening a spec means:
|
||||
// - Moving every complex inline schema to be a definition with an auto-generated name in a depth-first fashion.
|
||||
//
|
||||
// By complex, we mean every JSON object with some properties.
|
||||
// Arrays, when they do not define a tuple,
|
||||
// or empty objects with or without additionalProperties, are not considered complex and remain inline.
|
||||
//
|
||||
// NOTE: rewritten schemas get a vendor extension x-go-gen-location so we know from which part of the spec definitions
|
||||
// have been created.
|
||||
//
|
||||
// Available flattening options:
|
||||
// - Minimal: stops flattening after minimal $ref processing, leaving schema constructs untouched
|
||||
// - Expand: expand all $ref's in the document (inoperative if Minimal is set to true)
|
||||
// - Verbose: croaks about name conflicts detected
|
||||
// - RemoveUnused: removes unused parameters, responses and definitions after expansion/flattening
|
||||
//
|
||||
// NOTE: expansion removes all $ref save circular $ref, which remain in place
|
||||
//
|
||||
// TODO: additional options
|
||||
// - PropagateNameExtensions: ensure that created entries properly follow naming rules when their parents have set an
|
||||
// x-go-name extension
|
||||
// - LiftAllOfs:
|
||||
// - limit the flattening of allOf members when simple objects
|
||||
// - merge allOf with validation only
|
||||
// - merge allOf with extensions only
|
||||
// - ...
|
||||
//
|
||||
func Flatten(opts FlattenOpts) error {
|
||||
debugLog("FlattenOpts: %#v", opts)
|
||||
|
||||
opts.flattenContext = newContext()
|
||||
|
||||
// 1. Recursively expand responses, parameters, path items and items in simple schemas.
|
||||
//
|
||||
// This simplifies the spec and leaves only the $ref's in schema objects.
|
||||
if err := expand(&opts); err != nil {
|
||||
return err
|
||||
}
|
||||
|
||||
// 2. Strip absolute $ref's from the current document when they actually point into the root document,
|
||||
// so we can recognize them as proper definitions
|
||||
//
|
||||
// In particular, this works around issue go-openapi/spec#76: leading absolute file in $ref is stripped
|
||||
if err := normalizeRef(&opts); err != nil {
|
||||
return err
|
||||
}
|
||||
|
||||
// 3. Optionally remove shared parameters and responses already expanded (now unused).
|
||||
//
|
||||
// Operation parameters (i.e. under paths) remain.
|
||||
if opts.RemoveUnused {
|
||||
removeUnusedShared(&opts)
|
||||
}
|
||||
|
||||
// 4. Import all remote references.
|
||||
if err := importReferences(&opts); err != nil {
|
||||
return err
|
||||
}
|
||||
|
||||
// 5. full flattening: rewrite inline schemas (schemas that aren't simple types or arrays or maps)
|
||||
if !opts.Minimal && !opts.Expand {
|
||||
if err := nameInlinedSchemas(&opts); err != nil {
|
||||
return err
|
||||
}
|
||||
}
|
||||
|
||||
// 6. Rewrite JSON pointers other than $ref to named definitions
|
||||
// and attempt to resolve conflicting names whenever possible.
|
||||
if err := stripPointersAndOAIGen(&opts); err != nil {
|
||||
return err
|
||||
}
|
||||
|
||||
// 7. Strip the spec from unused definitions
|
||||
if opts.RemoveUnused {
|
||||
removeUnused(&opts)
|
||||
}
|
||||
|
||||
// 8. Issue warning notifications, if any
|
||||
opts.croak()
|
||||
|
||||
// TODO: simplify known schema patterns to flat objects with properties
|
||||
// examples:
|
||||
// - lift simple allOf object,
|
||||
// - empty allOf with validation only or extensions only
|
||||
// - rework allOf arrays
|
||||
// - rework allOf additionalProperties
|
||||
|
||||
return nil
|
||||
}
|
||||
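// Sketch of the two common invocations (illustrative only: "an" is assumed to
// be an *analysis.Spec obtained from New, "base" the root document location):
//
//	// minimal flattening: import remote $ref's and normalize pointers,
//	// but leave schema constructs untouched
//	err := Flatten(FlattenOpts{Spec: an, BasePath: base, Minimal: true})
//
//	// full flattening: additionally name every complex inline schema
//	// under #/definitions and drop whatever ends up unused
//	err = Flatten(FlattenOpts{Spec: an, BasePath: base, RemoveUnused: true, Verbose: true})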
|
||||
func expand(opts *FlattenOpts) error {
|
||||
if err := spec.ExpandSpec(opts.Swagger(), opts.ExpandOpts(!opts.Expand)); err != nil {
|
||||
return err
|
||||
}
|
||||
|
||||
opts.Spec.reload() // re-analyze
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
// normalizeRef strips the current file from any absolute file $ref. This works around issue go-openapi/spec#76:
|
||||
// leading absolute file in $ref is stripped
|
||||
func normalizeRef(opts *FlattenOpts) error {
|
||||
debugLog("normalizeRef")
|
||||
|
||||
altered := false
|
||||
for k, w := range opts.Spec.references.allRefs {
|
||||
if !strings.HasPrefix(w.String(), opts.BasePath+definitionsPath) { // may be a mix of / and \, depending on OS
|
||||
continue
|
||||
}
|
||||
|
||||
altered = true
|
||||
debugLog("stripping absolute path for: %s", w.String())
|
||||
|
||||
// strip the base path from definition
|
||||
if err := replace.UpdateRef(opts.Swagger(), k,
|
||||
spec.MustCreateRef(path.Join(definitionsPath, path.Base(w.String())))); err != nil {
|
||||
return err
|
||||
}
|
||||
}
|
||||
|
||||
if altered {
|
||||
opts.Spec.reload() // re-analyze
|
||||
}
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
func removeUnusedShared(opts *FlattenOpts) {
|
||||
opts.Swagger().Parameters = nil
|
||||
opts.Swagger().Responses = nil
|
||||
|
||||
opts.Spec.reload() // re-analyze
|
||||
}
|
||||
|
||||
func importReferences(opts *FlattenOpts) error {
|
||||
var (
|
||||
imported bool
|
||||
err error
|
||||
)
|
||||
|
||||
for !imported && err == nil {
|
||||
// iteratively import remote references until none left.
|
||||
// This inlining deals with name conflicts by introducing auto-generated names ("OAIGen")
|
||||
imported, err = importExternalReferences(opts)
|
||||
|
||||
opts.Spec.reload() // re-analyze
|
||||
}
|
||||
|
||||
return err
|
||||
}
|
||||
|
||||
// nameInlinedSchemas replaces every complex inline construct by a named definition.
|
||||
func nameInlinedSchemas(opts *FlattenOpts) error {
|
||||
debugLog("nameInlinedSchemas")
|
||||
|
||||
namer := &InlineSchemaNamer{
|
||||
Spec: opts.Swagger(),
|
||||
Operations: operations.AllOpRefsByRef(opts.Spec, nil),
|
||||
flattenContext: opts.flattenContext,
|
||||
opts: opts,
|
||||
}
|
||||
|
||||
depthFirst := sortref.DepthFirst(opts.Spec.allSchemas)
|
||||
for _, key := range depthFirst {
|
||||
sch := opts.Spec.allSchemas[key]
|
||||
if sch.Schema == nil || sch.Schema.Ref.String() != "" || sch.TopLevel {
|
||||
continue
|
||||
}
|
||||
|
||||
asch, err := Schema(SchemaOpts{Schema: sch.Schema, Root: opts.Swagger(), BasePath: opts.BasePath})
|
||||
if err != nil {
|
||||
return fmt.Errorf("schema analysis [%s]: %w", key, err)
|
||||
}
|
||||
|
||||
if asch.isAnalyzedAsComplex() { // move complex schemas to definitions
|
||||
if err := namer.Name(key, sch.Schema, asch); err != nil {
|
||||
return err
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
opts.Spec.reload() // re-analyze
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
func removeUnused(opts *FlattenOpts) {
|
||||
expected := make(map[string]struct{})
|
||||
for k := range opts.Swagger().Definitions {
|
||||
expected[path.Join(definitionsPath, jsonpointer.Escape(k))] = struct{}{}
|
||||
}
|
||||
|
||||
for _, k := range opts.Spec.AllDefinitionReferences() {
|
||||
delete(expected, k)
|
||||
}
|
||||
|
||||
for k := range expected {
|
||||
debugLog("removing unused definition %s", path.Base(k))
|
||||
if opts.Verbose {
|
||||
log.Printf("info: removing unused definition: %s", path.Base(k))
|
||||
}
|
||||
delete(opts.Swagger().Definitions, path.Base(k))
|
||||
}
|
||||
|
||||
opts.Spec.reload() // re-analyze
|
||||
}
|
||||
|
||||
func importKnownRef(entry sortref.RefRevIdx, refStr, newName string, opts *FlattenOpts) error {
|
||||
// rewrite ref with already resolved external ref (useful for cyclical refs):
|
||||
// rewrite external refs to local ones
|
||||
debugLog("resolving known ref [%s] to %s", refStr, newName)
|
||||
|
||||
for _, key := range entry.Keys {
|
||||
if err := replace.UpdateRef(opts.Swagger(), key, spec.MustCreateRef(path.Join(definitionsPath, newName))); err != nil {
|
||||
return err
|
||||
}
|
||||
}
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
func importNewRef(entry sortref.RefRevIdx, refStr string, opts *FlattenOpts) error {
|
||||
var (
|
||||
isOAIGen bool
|
||||
newName string
|
||||
)
|
||||
|
||||
debugLog("resolving schema from remote $ref [%s]", refStr)
|
||||
|
||||
sch, err := spec.ResolveRefWithBase(opts.Swagger(), &entry.Ref, opts.ExpandOpts(false))
|
||||
if err != nil {
|
||||
return fmt.Errorf("could not resolve schema: %w", err)
|
||||
}
|
||||
|
||||
// at this stage only $ref analysis matters
|
||||
partialAnalyzer := &Spec{
|
||||
references: referenceAnalysis{},
|
||||
patterns: patternAnalysis{},
|
||||
enums: enumAnalysis{},
|
||||
}
|
||||
partialAnalyzer.reset()
|
||||
partialAnalyzer.analyzeSchema("", sch, "/")
|
||||
|
||||
// now rewrite those refs with rebase
|
||||
for key, ref := range partialAnalyzer.references.allRefs {
|
||||
if err := replace.UpdateRef(sch, key, spec.MustCreateRef(normalize.RebaseRef(entry.Ref.String(), ref.String()))); err != nil {
|
||||
return fmt.Errorf("failed to rewrite ref for key %q at %s: %w", key, entry.Ref.String(), err)
|
||||
}
|
||||
}
|
||||
|
||||
// generate a unique name - isOAIGen means that a naming conflict was resolved by changing the name
|
||||
newName, isOAIGen = uniqifyName(opts.Swagger().Definitions, nameFromRef(entry.Ref))
|
||||
debugLog("new name for [%s]: %s - with name conflict:%t", strings.Join(entry.Keys, ", "), newName, isOAIGen)
|
||||
|
||||
opts.flattenContext.resolved[refStr] = newName
|
||||
|
||||
// rewrite the external refs to local ones
|
||||
for _, key := range entry.Keys {
|
||||
if err := replace.UpdateRef(opts.Swagger(), key,
|
||||
spec.MustCreateRef(path.Join(definitionsPath, newName))); err != nil {
|
||||
return err
|
||||
}
|
||||
|
||||
// keep track of created refs
|
||||
resolved := false
|
||||
if _, ok := opts.flattenContext.newRefs[key]; ok {
|
||||
resolved = opts.flattenContext.newRefs[key].resolved
|
||||
}
|
||||
|
||||
debugLog("keeping track of ref: %s (%s), resolved: %t", key, newName, resolved)
|
||||
opts.flattenContext.newRefs[key] = &newRef{
|
||||
key: key,
|
||||
newName: newName,
|
||||
path: path.Join(definitionsPath, newName),
|
||||
isOAIGen: isOAIGen,
|
||||
resolved: resolved,
|
||||
schema: sch,
|
||||
}
|
||||
}
|
||||
|
||||
// add the resolved schema to the definitions
|
||||
schutils.Save(opts.Swagger(), newName, sch)
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
// importExternalReferences iteratively digs remote references and imports them into the main schema.
|
||||
//
|
||||
// At every iteration, new remotes may be found when digging deeper: they are rebased to the current schema before being imported.
|
||||
//
|
||||
// This returns true when no more remote references can be found.
|
||||
func importExternalReferences(opts *FlattenOpts) (bool, error) {
|
||||
debugLog("importExternalReferences")
|
||||
|
||||
groupedRefs := sortref.ReverseIndex(opts.Spec.references.schemas, opts.BasePath)
|
||||
sortedRefStr := make([]string, 0, len(groupedRefs))
|
||||
if opts.flattenContext == nil {
|
||||
opts.flattenContext = newContext()
|
||||
}
|
||||
|
||||
// sort $ref resolution to ensure deterministic name conflict resolution
|
||||
for refStr := range groupedRefs {
|
||||
sortedRefStr = append(sortedRefStr, refStr)
|
||||
}
|
||||
sort.Strings(sortedRefStr)
|
||||
|
||||
complete := true
|
||||
|
||||
for _, refStr := range sortedRefStr {
|
||||
entry := groupedRefs[refStr]
|
||||
if entry.Ref.HasFragmentOnly {
|
||||
continue
|
||||
}
|
||||
|
||||
complete = false
|
||||
|
||||
newName := opts.flattenContext.resolved[refStr]
|
||||
if newName != "" {
|
||||
if err := importKnownRef(entry, refStr, newName, opts); err != nil {
|
||||
return false, err
|
||||
}
|
||||
|
||||
continue
|
||||
}
|
||||
|
||||
// resolve schemas
|
||||
if err := importNewRef(entry, refStr, opts); err != nil {
|
||||
return false, err
|
||||
}
|
||||
}
|
||||
|
||||
// maintains ref index entries
|
||||
for k := range opts.flattenContext.newRefs {
|
||||
r := opts.flattenContext.newRefs[k]
|
||||
|
||||
// update tracking with resolved schemas
|
||||
if r.schema.Ref.String() != "" {
|
||||
ref := spec.MustCreateRef(r.path)
|
||||
sch, err := spec.ResolveRefWithBase(opts.Swagger(), &ref, opts.ExpandOpts(false))
|
||||
if err != nil {
|
||||
return false, fmt.Errorf("could not resolve schema: %w", err)
|
||||
}
|
||||
|
||||
r.schema = sch
|
||||
}
|
||||
|
||||
if r.path == k {
|
||||
continue
|
||||
}
|
||||
|
||||
// update tracking with renamed keys: got a cascade of refs
|
||||
renamed := *r
|
||||
renamed.key = r.path
|
||||
opts.flattenContext.newRefs[renamed.path] = &renamed
|
||||
|
||||
// indirect ref
|
||||
r.newName = path.Base(k)
|
||||
r.schema = spec.RefSchema(r.path)
|
||||
r.path = k
|
||||
r.isOAIGen = strings.Contains(k, "OAIGen")
|
||||
}
|
||||
|
||||
return complete, nil
|
||||
}
|
||||
|
||||
// stripPointersAndOAIGen removes anonymous JSON pointers from the spec and chains with the name-conflict handler.
|
||||
// This loops until the spec has no such pointer and all name conflicts have been reduced as much as possible.
|
||||
func stripPointersAndOAIGen(opts *FlattenOpts) error {
|
||||
// name all JSON pointers to anonymous documents
|
||||
if err := namePointers(opts); err != nil {
|
||||
return err
|
||||
}
|
||||
|
||||
// remove unnecessary OAIGen ref (created when flattening external refs creates name conflicts)
|
||||
hasIntroducedPointerOrInline, ers := stripOAIGen(opts)
|
||||
if ers != nil {
|
||||
return ers
|
||||
}
|
||||
|
||||
// iterate as pointer or OAIGen resolution may introduce inline schemas or pointers
|
||||
for hasIntroducedPointerOrInline {
|
||||
if !opts.Minimal {
|
||||
opts.Spec.reload() // re-analyze
|
||||
if err := nameInlinedSchemas(opts); err != nil {
|
||||
return err
|
||||
}
|
||||
}
|
||||
|
||||
if err := namePointers(opts); err != nil {
|
||||
return err
|
||||
}
|
||||
|
||||
// restrip and re-analyze
|
||||
var err error
|
||||
if hasIntroducedPointerOrInline, err = stripOAIGen(opts); err != nil {
|
||||
return err
|
||||
}
|
||||
}
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
// stripOAIGen strips the spec from unnecessary OAIGen constructs, initially created to dedupe flattened definitions.
|
||||
//
|
||||
// A dedupe is deemed unnecessary whenever:
|
||||
// - the only conflict is with its (single) parent: OAIGen is merged into its parent (reinlining)
|
||||
// - there is a conflict with multiple parents: merge OAIGen into the first parent, then rewrite the other parents to point to
|
||||
// the first parent.
|
||||
//
|
||||
// This function returns true whenever it re-inlined a complex schema, so the caller may choose to iterate
|
||||
// pointer and name resolution again.
|
||||
func stripOAIGen(opts *FlattenOpts) (bool, error) {
|
||||
debugLog("stripOAIGen")
|
||||
replacedWithComplex := false
|
||||
|
||||
// figure out referrers of OAIGen definitions (doing it before the refs start mutating)
|
||||
for _, r := range opts.flattenContext.newRefs {
|
||||
updateRefParents(opts.Spec.references.allRefs, r)
|
||||
}
|
||||
|
||||
for k := range opts.flattenContext.newRefs {
|
||||
r := opts.flattenContext.newRefs[k]
|
||||
debugLog("newRefs[%s]: isOAIGen: %t, resolved: %t, name: %s, path:%s, #parents: %d, parents: %v, ref: %s",
|
||||
k, r.isOAIGen, r.resolved, r.newName, r.path, len(r.parents), r.parents, r.schema.Ref.String())
|
||||
|
||||
if !r.isOAIGen || len(r.parents) == 0 {
|
||||
continue
|
||||
}
|
||||
|
||||
hasReplacedWithComplex, err := stripOAIGenForRef(opts, k, r)
|
||||
if err != nil {
|
||||
return replacedWithComplex, err
|
||||
}
|
||||
|
||||
replacedWithComplex = replacedWithComplex || hasReplacedWithComplex
|
||||
}
|
||||
|
||||
debugLog("replacedWithComplex: %t", replacedWithComplex)
|
||||
opts.Spec.reload() // re-analyze
|
||||
|
||||
return replacedWithComplex, nil
|
||||
}
|
||||
|
||||
// updateRefParents updates all parents of an updated $ref
|
||||
func updateRefParents(allRefs map[string]spec.Ref, r *newRef) {
|
||||
if !r.isOAIGen || r.resolved { // bail on already resolved entries (avoid looping)
|
||||
return
|
||||
}
|
||||
for k, v := range allRefs {
|
||||
if r.path != v.String() {
|
||||
continue
|
||||
}
|
||||
|
||||
found := false
|
||||
for _, p := range r.parents {
|
||||
if p == k {
|
||||
found = true
|
||||
|
||||
break
|
||||
}
|
||||
}
|
||||
if !found {
|
||||
r.parents = append(r.parents, k)
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
func stripOAIGenForRef(opts *FlattenOpts, k string, r *newRef) (bool, error) {
|
||||
replacedWithComplex := false
|
||||
|
||||
pr := sortref.TopmostFirst(r.parents)
|
||||
|
||||
// rewrite first parent schema in hierarchical then lexicographical order
|
||||
debugLog("rewrite first parent %s with schema", pr[0])
|
||||
if err := replace.UpdateRefWithSchema(opts.Swagger(), pr[0], r.schema); err != nil {
|
||||
return false, err
|
||||
}
|
||||
|
||||
if pa, ok := opts.flattenContext.newRefs[pr[0]]; ok && pa.isOAIGen {
|
||||
// update parent in ref index entry
|
||||
debugLog("update parent entry: %s", pr[0])
|
||||
pa.schema = r.schema
|
||||
pa.resolved = false
|
||||
replacedWithComplex = true
|
||||
}
|
||||
|
||||
// rewrite other parents to point to first parent
|
||||
if len(pr) > 1 {
|
||||
for _, p := range pr[1:] {
|
||||
replacingRef := spec.MustCreateRef(pr[0])
|
||||
|
||||
// set complex when replacing ref is an anonymous jsonpointer: further processing may be required
|
||||
replacedWithComplex = replacedWithComplex || path.Dir(replacingRef.String()) != definitionsPath
|
||||
debugLog("rewrite parent with ref: %s", replacingRef.String())
|
||||
|
||||
// NOTE: it is possible at this stage to introduce json pointers (to non-definitions places).
|
||||
// Those are stripped later on.
|
||||
if err := replace.UpdateRef(opts.Swagger(), p, replacingRef); err != nil {
|
||||
return false, err
|
||||
}
|
||||
|
||||
if pa, ok := opts.flattenContext.newRefs[p]; ok && pa.isOAIGen {
|
||||
// update parent in ref index
|
||||
debugLog("update parent entry: %s", p)
|
||||
pa.schema = r.schema
|
||||
pa.resolved = false
|
||||
replacedWithComplex = true
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// remove OAIGen definition
|
||||
debugLog("removing definition %s", path.Base(r.path))
|
||||
delete(opts.Swagger().Definitions, path.Base(r.path))
|
||||
|
||||
// propagate changes in ref index for keys which have this one as a parent
|
||||
for kk, value := range opts.flattenContext.newRefs {
|
||||
if kk == k || !value.isOAIGen || value.resolved {
|
||||
continue
|
||||
}
|
||||
|
||||
found := false
|
||||
newParents := make([]string, 0, len(value.parents))
|
||||
for _, parent := range value.parents {
|
||||
switch {
|
||||
case parent == r.path:
|
||||
found = true
|
||||
parent = pr[0]
|
||||
case strings.HasPrefix(parent, r.path+"/"):
|
||||
found = true
|
||||
parent = path.Join(pr[0], strings.TrimPrefix(parent, r.path))
|
||||
}
|
||||
|
||||
newParents = append(newParents, parent)
|
||||
}
|
||||
|
||||
if found {
|
||||
value.parents = newParents
|
||||
}
|
||||
}
|
||||
|
||||
// mark naming conflict as resolved
|
||||
debugLog("marking naming conflict resolved for key: %s", r.key)
|
||||
opts.flattenContext.newRefs[r.key].isOAIGen = false
|
||||
opts.flattenContext.newRefs[r.key].resolved = true
|
||||
|
||||
// determine if the previous substitution did inline a complex schema
|
||||
if r.schema != nil && r.schema.Ref.String() == "" { // inline schema
|
||||
asch, err := Schema(SchemaOpts{Schema: r.schema, Root: opts.Swagger(), BasePath: opts.BasePath})
|
||||
if err != nil {
|
||||
return false, err
|
||||
}
|
||||
|
||||
debugLog("re-inlined schema: parent: %s, %t", pr[0], asch.isAnalyzedAsComplex())
|
||||
replacedWithComplex = replacedWithComplex || !(path.Dir(pr[0]) == definitionsPath) && asch.isAnalyzedAsComplex()
|
||||
}
|
||||
|
||||
return replacedWithComplex, nil
|
||||
}
|
||||
|
||||
// namePointers replaces all JSON pointers to anonymous documents with a $ref to a new named definition.
|
||||
//
|
||||
// This is carried out depth-first. Pointers to $refs which are top-level definitions are replaced by the $ref itself.
|
||||
// Pointers to simple types are expanded, unless they express commonality (i.e. several such $ref are used).
|
||||
func namePointers(opts *FlattenOpts) error {
|
||||
debugLog("name pointers")
|
||||
|
||||
refsToReplace := make(map[string]SchemaRef, len(opts.Spec.references.schemas))
|
||||
for k, ref := range opts.Spec.references.allRefs {
|
||||
if path.Dir(ref.String()) == definitionsPath {
|
||||
// this is a ref to a top-level definition: ok
|
||||
continue
|
||||
}
|
||||
|
||||
result, err := replace.DeepestRef(opts.Swagger(), opts.ExpandOpts(false), ref)
|
||||
if err != nil {
|
||||
return fmt.Errorf("at %s, %w", k, err)
|
||||
}
|
||||
|
||||
replacingRef := result.Ref
|
||||
sch := result.Schema
|
||||
if opts.flattenContext != nil {
|
||||
opts.flattenContext.warnings = append(opts.flattenContext.warnings, result.Warnings...)
|
||||
}
|
||||
|
||||
debugLog("planning pointer to replace at %s: %s, resolved to: %s", k, ref.String(), replacingRef.String())
|
||||
refsToReplace[k] = SchemaRef{
|
||||
Name: k, // caller
|
||||
Ref: replacingRef, // called
|
||||
Schema: sch,
|
||||
TopLevel: path.Dir(replacingRef.String()) == definitionsPath,
|
||||
}
|
||||
}
|
||||
|
||||
depthFirst := sortref.DepthFirst(refsToReplace)
|
||||
namer := &InlineSchemaNamer{
|
||||
Spec: opts.Swagger(),
|
||||
Operations: operations.AllOpRefsByRef(opts.Spec, nil),
|
||||
flattenContext: opts.flattenContext,
|
||||
opts: opts,
|
||||
}
|
||||
|
||||
for _, key := range depthFirst {
|
||||
v := refsToReplace[key]
|
||||
// update current replacement, which may have been updated by previous changes of deeper elements
|
||||
result, erd := replace.DeepestRef(opts.Swagger(), opts.ExpandOpts(false), v.Ref)
|
||||
if erd != nil {
|
||||
return fmt.Errorf("at %s, %w", key, erd)
|
||||
}
|
||||
|
||||
if opts.flattenContext != nil {
|
||||
opts.flattenContext.warnings = append(opts.flattenContext.warnings, result.Warnings...)
|
||||
}
|
||||
|
||||
v.Ref = result.Ref
|
||||
v.Schema = result.Schema
|
||||
v.TopLevel = path.Dir(result.Ref.String()) == definitionsPath
|
||||
debugLog("replacing pointer at %s: resolved to: %s", key, v.Ref.String())
|
||||
|
||||
if v.TopLevel {
|
||||
debugLog("replace pointer %s by canonical definition: %s", key, v.Ref.String())
|
||||
|
||||
// if the schema is a $ref to a top level definition, just rewrite the pointer to this $ref
|
||||
if err := replace.UpdateRef(opts.Swagger(), key, v.Ref); err != nil {
|
||||
return err
|
||||
}
|
||||
|
||||
continue
|
||||
}
|
||||
|
||||
if err := flattenAnonPointer(key, v, refsToReplace, namer, opts); err != nil {
|
||||
return err
|
||||
}
|
||||
}
|
||||
|
||||
opts.Spec.reload() // re-analyze
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
func flattenAnonPointer(key string, v SchemaRef, refsToReplace map[string]SchemaRef, namer *InlineSchemaNamer, opts *FlattenOpts) error {
|
||||
// this is a JSON pointer to an anonymous document (internal or external):
|
||||
// create a definition for this schema when:
|
||||
// - it is a complex schema
|
||||
// - or it is pointed by more than one $ref (i.e. expresses commonality)
|
||||
// otherwise, expand the pointer (single reference to a simple type)
|
||||
//
|
||||
// The named definition for this follows the target's key, not the caller's
|
||||
debugLog("namePointers at %s for %s", key, v.Ref.String())
|
||||
|
||||
// qualify the expanded schema
|
||||
asch, ers := Schema(SchemaOpts{Schema: v.Schema, Root: opts.Swagger(), BasePath: opts.BasePath})
|
||||
if ers != nil {
|
||||
return fmt.Errorf("schema analysis [%s]: %w", key, ers)
|
||||
}
|
||||
callers := make([]string, 0, 64)
|
||||
|
||||
debugLog("looking for callers")
|
||||
|
||||
an := New(opts.Swagger())
|
||||
for k, w := range an.references.allRefs {
|
||||
r, err := replace.DeepestRef(opts.Swagger(), opts.ExpandOpts(false), w)
|
||||
if err != nil {
|
||||
return fmt.Errorf("at %s, %w", key, err)
|
||||
}
|
||||
|
||||
if opts.flattenContext != nil {
|
||||
opts.flattenContext.warnings = append(opts.flattenContext.warnings, r.Warnings...)
|
||||
}
|
||||
|
||||
if r.Ref.String() == v.Ref.String() {
|
||||
callers = append(callers, k)
|
||||
}
|
||||
}
|
||||
|
||||
debugLog("callers for %s: %d", v.Ref.String(), len(callers))
|
||||
if len(callers) == 0 {
|
||||
// has already been updated and resolved
|
||||
return nil
|
||||
}
|
||||
|
||||
parts := sortref.KeyParts(v.Ref.String())
|
||||
debugLog("number of callers for %s: %d", v.Ref.String(), len(callers))
|
||||
|
||||
// identifying edge case when the namer did nothing because we point to a non-schema object
|
||||
// no definition is created and we expand the $ref for all callers
|
||||
if (!asch.IsSimpleSchema || len(callers) > 1) && !parts.IsSharedParam() && !parts.IsSharedResponse() {
|
||||
debugLog("replace JSON pointer at [%s] by definition: %s", key, v.Ref.String())
|
||||
if err := namer.Name(v.Ref.String(), v.Schema, asch); err != nil {
|
||||
return err
|
||||
}
|
||||
|
||||
// regular case: we named the $ref as a definition, and we move all callers to this new $ref
|
||||
for _, caller := range callers {
|
||||
if caller == key {
|
||||
continue
|
||||
}
|
||||
|
||||
// move $ref for next to resolve
|
||||
debugLog("identified caller of %s at [%s]", v.Ref.String(), caller)
|
||||
c := refsToReplace[caller]
|
||||
c.Ref = v.Ref
|
||||
refsToReplace[caller] = c
|
||||
}
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
debugLog("expand JSON pointer for key=%s", key)
|
||||
|
||||
if err := replace.UpdateRefWithSchema(opts.Swagger(), key, v.Schema); err != nil {
|
||||
return err
|
||||
}
|
||||
// NOTE: there is no other caller to update
|
||||
|
||||
return nil
|
||||
}
|
||||
293
vendor/github.com/go-openapi/analysis/flatten_name.go
generated
vendored
Normal file
293
vendor/github.com/go-openapi/analysis/flatten_name.go
generated
vendored
Normal file
|
|
@@ -0,0 +1,293 @@
|
|||
package analysis
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
"path"
|
||||
"sort"
|
||||
"strings"
|
||||
|
||||
"github.com/go-openapi/analysis/internal/flatten/operations"
|
||||
"github.com/go-openapi/analysis/internal/flatten/replace"
|
||||
"github.com/go-openapi/analysis/internal/flatten/schutils"
|
||||
"github.com/go-openapi/analysis/internal/flatten/sortref"
|
||||
"github.com/go-openapi/spec"
|
||||
"github.com/go-openapi/swag"
|
||||
)
|
||||
|
||||
// InlineSchemaNamer finds a new name for an inlined type
|
||||
type InlineSchemaNamer struct {
|
||||
Spec *spec.Swagger
|
||||
Operations map[string]operations.OpRef
|
||||
flattenContext *context
|
||||
opts *FlattenOpts
|
||||
}
|
||||
|
||||
// Name yields a new name for the inline schema
|
||||
func (isn *InlineSchemaNamer) Name(key string, schema *spec.Schema, aschema *AnalyzedSchema) error {
|
||||
debugLog("naming inlined schema at %s", key)
|
||||
|
||||
parts := sortref.KeyParts(key)
|
||||
for _, name := range namesFromKey(parts, aschema, isn.Operations) {
|
||||
if name == "" {
|
||||
continue
|
||||
}
|
||||
|
||||
// create unique name
|
||||
newName, isOAIGen := uniqifyName(isn.Spec.Definitions, swag.ToJSONName(name))
|
||||
|
||||
// clone schema
|
||||
sch := schutils.Clone(schema)
|
||||
|
||||
// replace values on schema
|
||||
if err := replace.RewriteSchemaToRef(isn.Spec, key,
|
||||
spec.MustCreateRef(path.Join(definitionsPath, newName))); err != nil {
|
||||
return fmt.Errorf("error while creating definition %q from inline schema: %w", newName, err)
|
||||
}
|
||||
|
||||
// rewrite any dependent $ref pointing to this place,
|
||||
// when not already pointing to a top-level definition.
|
||||
//
|
||||
// NOTE: this is important if such referers use arbitrary JSON pointers.
|
||||
an := New(isn.Spec)
|
||||
for k, v := range an.references.allRefs {
|
||||
r, erd := replace.DeepestRef(isn.opts.Swagger(), isn.opts.ExpandOpts(false), v)
|
||||
if erd != nil {
|
||||
return fmt.Errorf("at %s, %w", k, erd)
|
||||
}
|
||||
|
||||
if isn.opts.flattenContext != nil {
|
||||
isn.opts.flattenContext.warnings = append(isn.opts.flattenContext.warnings, r.Warnings...)
|
||||
}
|
||||
|
||||
if r.Ref.String() != key && (r.Ref.String() != path.Join(definitionsPath, newName) || path.Dir(v.String()) == definitionsPath) {
|
||||
continue
|
||||
}
|
||||
|
||||
debugLog("found a $ref to a rewritten schema: %s points to %s", k, v.String())
|
||||
|
||||
// rewrite $ref to the new target
|
||||
if err := replace.UpdateRef(isn.Spec, k,
|
||||
spec.MustCreateRef(path.Join(definitionsPath, newName))); err != nil {
|
||||
return err
|
||||
}
|
||||
}
|
||||
|
||||
// NOTE: this extension is currently not used by go-swagger (provided for information only)
|
||||
sch.AddExtension("x-go-gen-location", GenLocation(parts))
|
||||
|
||||
// save cloned schema to definitions
|
||||
schutils.Save(isn.Spec, newName, sch)
|
||||
|
||||
// keep track of created refs
|
||||
if isn.flattenContext == nil {
|
||||
continue
|
||||
}
|
||||
|
||||
debugLog("track created ref: key=%s, newName=%s, isOAIGen=%t", key, newName, isOAIGen)
|
||||
resolved := false
|
||||
|
||||
if _, ok := isn.flattenContext.newRefs[key]; ok {
|
||||
resolved = isn.flattenContext.newRefs[key].resolved
|
||||
}
|
||||
|
||||
isn.flattenContext.newRefs[key] = &newRef{
|
||||
key: key,
|
||||
newName: newName,
|
||||
path: path.Join(definitionsPath, newName),
|
||||
isOAIGen: isOAIGen,
|
||||
resolved: resolved,
|
||||
schema: sch,
|
||||
}
|
||||
}
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
// uniqifyName yields a unique name for a definition
|
||||
func uniqifyName(definitions spec.Definitions, name string) (string, bool) {
|
||||
isOAIGen := false
|
||||
if name == "" {
|
||||
name = "oaiGen"
|
||||
isOAIGen = true
|
||||
}
|
||||
|
||||
if len(definitions) == 0 {
|
||||
return name, isOAIGen
|
||||
}
|
||||
|
||||
unq := true
|
||||
for k := range definitions {
|
||||
if strings.EqualFold(k, name) {
|
||||
unq = false
|
||||
|
||||
break
|
||||
}
|
||||
}
|
||||
|
||||
if unq {
|
||||
return name, isOAIGen
|
||||
}
|
||||
|
||||
name += "OAIGen"
|
||||
isOAIGen = true
|
||||
var idx int
|
||||
unique := name
|
||||
_, known := definitions[unique]
|
||||
|
||||
for known {
|
||||
idx++
|
||||
unique = fmt.Sprintf("%s%d", name, idx)
|
||||
_, known = definitions[unique]
|
||||
}
|
||||
|
||||
return unique, isOAIGen
|
||||
}
|
||||
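// Illustrative behaviour (hypothetical definitions map):
//
//	defs := spec.Definitions{"user": spec.Schema{}}
//	uniqifyName(defs, "address") // -> "address", false
//	uniqifyName(defs, "user")    // -> "userOAIGen", true (conflict resolved)
//	uniqifyName(defs, "")        // -> "oaiGen", true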
|
||||
func namesFromKey(parts sortref.SplitKey, aschema *AnalyzedSchema, operations map[string]operations.OpRef) []string {
|
||||
var (
|
||||
baseNames [][]string
|
||||
startIndex int
|
||||
)
|
||||
|
||||
if parts.IsOperation() {
|
||||
baseNames, startIndex = namesForOperation(parts, operations)
|
||||
}
|
||||
|
||||
// definitions
|
||||
if parts.IsDefinition() {
|
||||
baseNames, startIndex = namesForDefinition(parts)
|
||||
}
|
||||
|
||||
result := make([]string, 0, len(baseNames))
|
||||
for _, segments := range baseNames {
|
||||
nm := parts.BuildName(segments, startIndex, partAdder(aschema))
|
||||
if nm == "" {
|
||||
continue
|
||||
}
|
||||
|
||||
result = append(result, nm)
|
||||
}
|
||||
sort.Strings(result)
|
||||
|
||||
return result
|
||||
}
|
||||
|
||||
func namesForParam(parts sortref.SplitKey, operations map[string]operations.OpRef) ([][]string, int) {
|
||||
var (
|
||||
baseNames [][]string
|
||||
startIndex int
|
||||
)
|
||||
|
||||
piref := parts.PathItemRef()
|
||||
if piref.String() != "" && parts.IsOperationParam() {
|
||||
if op, ok := operations[piref.String()]; ok {
|
||||
startIndex = 5
|
||||
baseNames = append(baseNames, []string{op.ID, "params", "body"})
|
||||
}
|
||||
} else if parts.IsSharedOperationParam() {
|
||||
pref := parts.PathRef()
|
||||
for k, v := range operations {
|
||||
if strings.HasPrefix(k, pref.String()) {
|
||||
startIndex = 4
|
||||
baseNames = append(baseNames, []string{v.ID, "params", "body"})
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
return baseNames, startIndex
|
||||
}
|
||||
|
||||
func namesForOperation(parts sortref.SplitKey, operations map[string]operations.OpRef) ([][]string, int) {
|
||||
var (
|
||||
baseNames [][]string
|
||||
startIndex int
|
||||
)
|
||||
|
||||
// params
|
||||
if parts.IsOperationParam() || parts.IsSharedOperationParam() {
|
||||
baseNames, startIndex = namesForParam(parts, operations)
|
||||
}
|
||||
|
||||
// responses
|
||||
if parts.IsOperationResponse() {
|
||||
piref := parts.PathItemRef()
|
||||
if piref.String() != "" {
|
||||
if op, ok := operations[piref.String()]; ok {
|
||||
startIndex = 6
|
||||
baseNames = append(baseNames, []string{op.ID, parts.ResponseName(), "body"})
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
return baseNames, startIndex
|
||||
}
|
||||
|
||||
func namesForDefinition(parts sortref.SplitKey) ([][]string, int) {
|
||||
nm := parts.DefinitionName()
|
||||
if nm != "" {
|
||||
return [][]string{{parts.DefinitionName()}}, 2
|
||||
}
|
||||
|
||||
return [][]string{}, 0
|
||||
}
|
||||
|
||||
// partAdder knows how to interpret a schema when it comes to building a name from parts
|
||||
func partAdder(aschema *AnalyzedSchema) sortref.PartAdder {
|
||||
return func(part string) []string {
|
||||
segments := make([]string, 0, 2)
|
||||
|
||||
if part == "items" || part == "additionalItems" {
|
||||
if aschema.IsTuple || aschema.IsTupleWithExtra {
|
||||
segments = append(segments, "tuple")
|
||||
} else {
|
||||
segments = append(segments, "items")
|
||||
}
|
||||
|
||||
if part == "additionalItems" {
|
||||
segments = append(segments, part)
|
||||
}
|
||||
|
||||
return segments
|
||||
}
|
||||
|
||||
segments = append(segments, part)
|
||||
|
||||
return segments
|
||||
}
|
||||
}
|
||||
|
||||
func nameFromRef(ref spec.Ref) string {
|
||||
u := ref.GetURL()
|
||||
if u.Fragment != "" {
|
||||
return swag.ToJSONName(path.Base(u.Fragment))
|
||||
}
|
||||
|
||||
if u.Path != "" {
|
||||
bn := path.Base(u.Path)
|
||||
if bn != "" && bn != "/" {
|
||||
ext := path.Ext(bn)
|
||||
if ext != "" {
|
||||
return swag.ToJSONName(bn[:len(bn)-len(ext)])
|
||||
}
|
||||
|
||||
return swag.ToJSONName(bn)
|
||||
}
|
||||
}
|
||||
|
||||
return swag.ToJSONName(strings.ReplaceAll(u.Host, ".", " "))
|
||||
}
|
||||
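// Illustrative outputs (hypothetical refs; swag.ToJSONName camel-cases with a
// lower-case initial):
//
//	nameFromRef(spec.MustCreateRef("#/definitions/myObject/properties/firstName")) // "firstName"
//	nameFromRef(spec.MustCreateRef("./common/models.yaml#/definitions/Error"))     // "error"
//	nameFromRef(spec.MustCreateRef("./common/models.yaml"))                        // "models"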
|
||||
// GenLocation indicates from which section of the specification (models or operations) a definition has been created.
|
||||
//
|
||||
// This is reflected in the output spec with an "x-go-gen-location" extension. At the moment, this is provided
|
||||
// for information only.
|
||||
func GenLocation(parts sortref.SplitKey) string {
|
||||
switch {
|
||||
case parts.IsOperation():
|
||||
return "operations"
|
||||
case parts.IsDefinition():
|
||||
return "models"
|
||||
default:
|
||||
return ""
|
||||
}
|
||||
}
|
||||
78
vendor/github.com/go-openapi/analysis/flatten_options.go
generated
vendored
Normal file
78
vendor/github.com/go-openapi/analysis/flatten_options.go
generated
vendored
Normal file
|
|
@@ -0,0 +1,78 @@
|
|||
package analysis
|
||||
|
||||
import (
|
||||
"log"
|
||||
|
||||
"github.com/go-openapi/spec"
|
||||
)
|
||||
|
||||
// FlattenOpts configuration for flattening a swagger specification.
|
||||
//
|
||||
// The BasePath parameter is used to locate remote relative $ref found in the specification.
|
||||
// This path is a file: it points to the location of the root document and may be either a local
|
||||
// file path or a URL.
|
||||
//
|
||||
// If none is specified, relative references (e.g. "$ref": "folder/schema.yaml#/definitions/...")
|
||||
// found in the spec are searched from the current working directory.
|
||||
type FlattenOpts struct {
|
||||
Spec *Spec // The analyzed spec to work with
|
||||
flattenContext *context // Internal context to track flattening activity
|
||||
|
||||
BasePath string // The location of the root document for this spec to resolve relative $ref
|
||||
|
||||
// Flattening options
|
||||
Expand bool // When true, skip flattening the spec and expand it instead (if Minimal is false)
|
||||
Minimal bool // When true, do not decompose complex structures such as allOf
|
||||
Verbose bool // enable some reporting on possible name conflicts detected
|
||||
RemoveUnused bool // When true, remove unused parameters, responses and definitions after expansion/flattening
|
||||
ContinueOnError bool // Continue when spec expansion issues are found
|
||||
|
||||
/* Extra keys */
|
||||
_ struct{} // require keys
|
||||
}
|
||||
|
||||
// ExpandOpts creates a spec.ExpandOptions to configure expanding a specification document.
|
||||
func (f *FlattenOpts) ExpandOpts(skipSchemas bool) *spec.ExpandOptions {
|
||||
return &spec.ExpandOptions{
|
||||
RelativeBase: f.BasePath,
|
||||
SkipSchemas: skipSchemas,
|
||||
ContinueOnError: f.ContinueOnError,
|
||||
}
|
||||
}
|
||||
|
||||
// Swagger gets the swagger specification for this flatten operation
|
||||
func (f *FlattenOpts) Swagger() *spec.Swagger {
|
||||
return f.Spec.spec
|
||||
}
|
||||
|
||||
// croak logs notifications and warnings about valid, but possibly unwanted constructs resulting
|
||||
// from flattening a spec
|
||||
func (f *FlattenOpts) croak() {
|
||||
if !f.Verbose {
|
||||
return
|
||||
}
|
||||
|
||||
reported := make(map[string]bool, len(f.flattenContext.newRefs))
|
||||
for _, v := range f.Spec.references.allRefs {
|
||||
// warns about duplicate handling
|
||||
for _, r := range f.flattenContext.newRefs {
|
||||
if r.isOAIGen && r.path == v.String() {
|
||||
reported[r.newName] = true
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
for k := range reported {
|
||||
log.Printf("warning: duplicate flattened definition name resolved as %s", k)
|
||||
}
|
||||
|
||||
// warns about possible type mismatches
|
||||
uniqueMsg := make(map[string]bool)
|
||||
for _, msg := range f.flattenContext.warnings {
|
||||
if _, ok := uniqueMsg[msg]; ok {
|
||||
continue
|
||||
}
|
||||
log.Printf("warning: %s", msg)
|
||||
uniqueMsg[msg] = true
|
||||
}
|
||||
}
|
||||
41
vendor/github.com/go-openapi/analysis/internal/debug/debug.go
generated
vendored
Normal file
41
vendor/github.com/go-openapi/analysis/internal/debug/debug.go
generated
vendored
Normal file
|
|
@@ -0,0 +1,41 @@
|
|||
// Copyright 2015 go-swagger maintainers
|
||||
//
|
||||
// Licensed under the Apache License, Version 2.0 (the "License");
|
||||
// you may not use this file except in compliance with the License.
|
||||
// You may obtain a copy of the License at
|
||||
//
|
||||
// http://www.apache.org/licenses/LICENSE-2.0
|
||||
//
|
||||
// Unless required by applicable law or agreed to in writing, software
|
||||
// distributed under the License is distributed on an "AS IS" BASIS,
|
||||
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
// See the License for the specific language governing permissions and
|
||||
// limitations under the License.
|
||||
|
||||
package debug
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
"log"
|
||||
"os"
|
||||
"path/filepath"
|
||||
"runtime"
|
||||
)
|
||||
|
||||
var (
|
||||
output = os.Stdout
|
||||
)
|
||||
|
||||
// GetLogger provides a prefix debug logger
|
||||
func GetLogger(prefix string, debug bool) func(string, ...interface{}) {
|
||||
if debug {
|
||||
logger := log.New(output, fmt.Sprintf("%s:", prefix), log.LstdFlags)
|
||||
|
||||
return func(msg string, args ...interface{}) {
|
||||
_, file1, pos1, _ := runtime.Caller(1)
|
||||
logger.Printf("%s:%d: %s", filepath.Base(file1), pos1, fmt.Sprintf(msg, args...))
|
||||
}
|
||||
}
|
||||
|
||||
return func(msg string, args ...interface{}) {}
|
||||
}
|
||||
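// Sketch of how sibling packages wire this up (this mirrors the pattern used
// by the flatten/replace package further down; the prefix is illustrative):
//
//	var debugLog = debug.GetLogger("analysis/flatten/normalize", os.Getenv("SWAGGER_DEBUG") != "")
//
//	debugLog("rebasing %s against %s", ref, baseRef)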
87
vendor/github.com/go-openapi/analysis/internal/flatten/normalize/normalize.go
generated
vendored
Normal file
87
vendor/github.com/go-openapi/analysis/internal/flatten/normalize/normalize.go
generated
vendored
Normal file
|
|
@@ -0,0 +1,87 @@
|
|||
package normalize
|
||||
|
||||
import (
|
||||
"net/url"
|
||||
"path"
|
||||
"path/filepath"
|
||||
"strings"
|
||||
|
||||
"github.com/go-openapi/spec"
|
||||
)
|
||||
|
||||
// RebaseRef rebases a remote ref relative to a base ref.
|
||||
//
|
||||
// NOTE: does not support JSONschema ID for $ref (we assume we are working with swagger specs here).
|
||||
//
|
||||
// NOTE(windows):
|
||||
// * refs are assumed to have been normalized with drive letter lower cased (from go-openapi/spec)
|
||||
// * "/" in paths may appear as escape sequences
|
||||
func RebaseRef(baseRef string, ref string) string {
|
||||
baseRef, _ = url.PathUnescape(baseRef)
|
||||
ref, _ = url.PathUnescape(ref)
|
||||
|
||||
if baseRef == "" || baseRef == "." || strings.HasPrefix(baseRef, "#") {
|
||||
return ref
|
||||
}
|
||||
|
||||
parts := strings.Split(ref, "#")
|
||||
|
||||
baseParts := strings.Split(baseRef, "#")
|
||||
baseURL, _ := url.Parse(baseParts[0])
|
||||
if strings.HasPrefix(ref, "#") {
|
||||
if baseURL.Host == "" {
|
||||
return strings.Join([]string{baseParts[0], parts[1]}, "#")
|
||||
}
|
||||
|
||||
return strings.Join([]string{baseParts[0], parts[1]}, "#")
|
||||
}
|
||||
|
||||
refURL, _ := url.Parse(parts[0])
|
||||
if refURL.Host != "" || filepath.IsAbs(parts[0]) {
|
||||
// not rebasing an absolute path
|
||||
return ref
|
||||
}
|
||||
|
||||
// there is a relative path
|
||||
var basePath string
|
||||
if baseURL.Host != "" {
|
||||
// when there is a host, standard URI rules apply (with "/")
|
||||
baseURL.Path = path.Dir(baseURL.Path)
|
||||
baseURL.Path = path.Join(baseURL.Path, "/"+parts[0])
|
||||
|
||||
return baseURL.String()
|
||||
}
|
||||
|
||||
// this is a local relative path
|
||||
// baseParts[0] and parts[0] are local filesystem directories/files
|
||||
basePath = filepath.Dir(baseParts[0])
|
||||
relPath := filepath.Join(basePath, string(filepath.Separator)+parts[0])
|
||||
if len(parts) > 1 {
|
||||
return strings.Join([]string{relPath, parts[1]}, "#")
|
||||
}
|
||||
|
||||
return relPath
|
||||
}
|
||||
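// Illustrative rebasing on a unix-like filesystem (paths are hypothetical):
//
//	RebaseRef("folder/spec.yaml", "models.yaml#/definitions/Thing")
//	// -> "folder/models.yaml#/definitions/Thing"
//	RebaseRef("folder/spec.yaml", "#/definitions/Thing")
//	// -> "folder/spec.yaml#/definitions/Thing"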
|
||||
// Path renders absolute path on remote file refs
|
||||
//
|
||||
// NOTE(windows):
|
||||
// * refs are assumed to have been normalized with drive letter lower cased (from go-openapi/spec)
|
||||
// * "/" in paths may appear as escape sequences
|
||||
func Path(ref spec.Ref, basePath string) string {
|
||||
uri, _ := url.PathUnescape(ref.String())
|
||||
if ref.HasFragmentOnly || filepath.IsAbs(uri) {
|
||||
return uri
|
||||
}
|
||||
|
||||
refURL, _ := url.Parse(uri)
|
||||
if refURL.Host != "" {
|
||||
return uri
|
||||
}
|
||||
|
||||
parts := strings.Split(uri, "#")
|
||||
// BasePath, parts[0] are local filesystem directories, guaranteed to be absolute at this stage
|
||||
parts[0] = filepath.Join(filepath.Dir(basePath), parts[0])
|
||||
|
||||
return strings.Join(parts, "#")
|
||||
}
|
||||
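// Illustrative call (hypothetical, unix-like locations):
//
//	Path(spec.MustCreateRef("models.yaml#/definitions/Thing"), "/abs/dir/spec.yaml")
//	// -> "/abs/dir/models.yaml#/definitions/Thing"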
90
vendor/github.com/go-openapi/analysis/internal/flatten/operations/operations.go
generated
vendored
Normal file
90
vendor/github.com/go-openapi/analysis/internal/flatten/operations/operations.go
generated
vendored
Normal file
|
|
@@ -0,0 +1,90 @@
|
|||
package operations
|
||||
|
||||
import (
|
||||
"path"
|
||||
"sort"
|
||||
"strings"
|
||||
|
||||
"github.com/go-openapi/jsonpointer"
|
||||
"github.com/go-openapi/spec"
|
||||
"github.com/go-openapi/swag"
|
||||
)
|
||||
|
||||
// AllOpRefsByRef returns an index of sortable operations
|
||||
func AllOpRefsByRef(specDoc Provider, operationIDs []string) map[string]OpRef {
|
||||
return OpRefsByRef(GatherOperations(specDoc, operationIDs))
|
||||
}
|
||||
|
||||
// OpRefsByRef indexes a map of sortable operations
|
||||
func OpRefsByRef(oprefs map[string]OpRef) map[string]OpRef {
|
||||
result := make(map[string]OpRef, len(oprefs))
|
||||
for _, v := range oprefs {
|
||||
result[v.Ref.String()] = v
|
||||
}
|
||||
|
||||
return result
|
||||
}
|
||||
|
||||
// OpRef is an indexable, sortable operation
|
||||
type OpRef struct {
|
||||
Method string
|
||||
Path string
|
||||
Key string
|
||||
ID string
|
||||
Op *spec.Operation
|
||||
Ref spec.Ref
|
||||
}
|
||||
|
||||
// OpRefs is a sortable collection of operations
|
||||
type OpRefs []OpRef
|
||||
|
||||
func (o OpRefs) Len() int { return len(o) }
|
||||
func (o OpRefs) Swap(i, j int) { o[i], o[j] = o[j], o[i] }
|
||||
func (o OpRefs) Less(i, j int) bool { return o[i].Key < o[j].Key }
|
||||
|
||||
// Provider knows how to collect operations from a spec
|
||||
type Provider interface {
|
||||
Operations() map[string]map[string]*spec.Operation
|
||||
}
|
||||
|
||||
// GatherOperations builds a map of sorted operations from a spec
|
||||
func GatherOperations(specDoc Provider, operationIDs []string) map[string]OpRef {
|
||||
var oprefs OpRefs
|
||||
|
||||
for method, pathItem := range specDoc.Operations() {
|
||||
for pth, operation := range pathItem {
|
||||
vv := *operation
|
||||
oprefs = append(oprefs, OpRef{
|
||||
Key: swag.ToGoName(strings.ToLower(method) + " " + pth),
|
||||
Method: method,
|
||||
Path: pth,
|
||||
ID: vv.ID,
|
||||
Op: &vv,
|
||||
Ref: spec.MustCreateRef("#" + path.Join("/paths", jsonpointer.Escape(pth), method)),
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
sort.Sort(oprefs)
|
||||
|
||||
operations := make(map[string]OpRef)
|
||||
for _, opr := range oprefs {
|
||||
nm := opr.ID
|
||||
if nm == "" {
|
||||
nm = opr.Key
|
||||
}
|
||||
|
||||
oo, found := operations[nm]
|
||||
if found && oo.Method != opr.Method && oo.Path != opr.Path {
|
||||
nm = opr.Key
|
||||
}
|
||||
|
||||
if len(operationIDs) == 0 || swag.ContainsStrings(operationIDs, opr.ID) || swag.ContainsStrings(operationIDs, nm) {
|
||||
opr.ID = nm
|
||||
opr.Op.ID = nm
|
||||
operations[nm] = opr
|
||||
}
|
||||
}
|
||||
|
||||
return operations
|
||||
}
|
||||
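// Sketch: indexing the operations of an analyzed spec by JSON pointer
// (illustrative: "an" is assumed to be an *analysis.Spec, which satisfies Provider):
//
//	index := AllOpRefsByRef(an, nil)
//	for ref, op := range index {
//		fmt.Println(ref, "->", op.Method, op.Path, op.ID)
//	}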
434
vendor/github.com/go-openapi/analysis/internal/flatten/replace/replace.go
generated
vendored
Normal file
434
vendor/github.com/go-openapi/analysis/internal/flatten/replace/replace.go
generated
vendored
Normal file
|
|
@@ -0,0 +1,434 @@
|
|||
package replace
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
"net/url"
|
||||
"os"
|
||||
"path"
|
||||
"strconv"
|
||||
|
||||
"github.com/go-openapi/analysis/internal/debug"
|
||||
"github.com/go-openapi/jsonpointer"
|
||||
"github.com/go-openapi/spec"
|
||||
)
|
||||
|
||||
const definitionsPath = "#/definitions"
|
||||
|
||||
var debugLog = debug.GetLogger("analysis/flatten/replace", os.Getenv("SWAGGER_DEBUG") != "")
|
||||
|
||||
// RewriteSchemaToRef replaces a schema with a Ref
|
||||
func RewriteSchemaToRef(sp *spec.Swagger, key string, ref spec.Ref) error {
|
||||
debugLog("rewriting schema to ref for %s with %s", key, ref.String())
|
||||
_, value, err := getPointerFromKey(sp, key)
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
|
||||
switch refable := value.(type) {
|
||||
case *spec.Schema:
|
||||
return rewriteParentRef(sp, key, ref)
|
||||
|
||||
case spec.Schema:
|
||||
return rewriteParentRef(sp, key, ref)
|
||||
|
||||
case *spec.SchemaOrArray:
|
||||
if refable.Schema != nil {
|
||||
refable.Schema = &spec.Schema{SchemaProps: spec.SchemaProps{Ref: ref}}
|
||||
}
|
||||
|
||||
case *spec.SchemaOrBool:
|
||||
if refable.Schema != nil {
|
||||
refable.Schema = &spec.Schema{SchemaProps: spec.SchemaProps{Ref: ref}}
|
||||
}
|
||||
default:
|
||||
return fmt.Errorf("no schema with ref found at %s for %T", key, value)
|
||||
}
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
func rewriteParentRef(sp *spec.Swagger, key string, ref spec.Ref) error {
|
||||
parent, entry, pvalue, err := getParentFromKey(sp, key)
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
|
||||
debugLog("rewriting holder for %T", pvalue)
|
||||
switch container := pvalue.(type) {
|
||||
case spec.Response:
|
||||
if err := rewriteParentRef(sp, "#"+parent, ref); err != nil {
|
||||
return err
|
||||
}
|
||||
|
||||
case *spec.Response:
|
||||
container.Schema = &spec.Schema{SchemaProps: spec.SchemaProps{Ref: ref}}
|
||||
|
||||
case *spec.Responses:
|
||||
statusCode, err := strconv.Atoi(entry)
|
||||
if err != nil {
|
||||
return fmt.Errorf("%s not a number: %w", key[1:], err)
|
||||
}
|
||||
resp := container.StatusCodeResponses[statusCode]
|
||||
resp.Schema = &spec.Schema{SchemaProps: spec.SchemaProps{Ref: ref}}
|
||||
container.StatusCodeResponses[statusCode] = resp
|
||||
|
||||
case map[string]spec.Response:
|
||||
resp := container[entry]
|
||||
resp.Schema = &spec.Schema{SchemaProps: spec.SchemaProps{Ref: ref}}
|
||||
container[entry] = resp
|
||||
|
||||
case spec.Parameter:
|
||||
if err := rewriteParentRef(sp, "#"+parent, ref); err != nil {
|
||||
return err
|
||||
}
|
||||
|
||||
case map[string]spec.Parameter:
|
||||
param := container[entry]
|
||||
param.Schema = &spec.Schema{SchemaProps: spec.SchemaProps{Ref: ref}}
|
||||
container[entry] = param
|
||||
|
||||
case []spec.Parameter:
|
||||
idx, err := strconv.Atoi(entry)
|
||||
if err != nil {
|
||||
return fmt.Errorf("%s not a number: %w", key[1:], err)
|
||||
}
|
||||
param := container[idx]
|
||||
param.Schema = &spec.Schema{SchemaProps: spec.SchemaProps{Ref: ref}}
|
||||
container[idx] = param
|
||||
|
||||
case spec.Definitions:
|
||||
container[entry] = spec.Schema{SchemaProps: spec.SchemaProps{Ref: ref}}
|
||||
|
||||
case map[string]spec.Schema:
|
||||
container[entry] = spec.Schema{SchemaProps: spec.SchemaProps{Ref: ref}}
|
||||
|
||||
case []spec.Schema:
|
||||
idx, err := strconv.Atoi(entry)
|
||||
if err != nil {
|
||||
return fmt.Errorf("%s not a number: %w", key[1:], err)
|
||||
}
|
||||
container[idx] = spec.Schema{SchemaProps: spec.SchemaProps{Ref: ref}}
|
||||
|
||||
case *spec.SchemaOrArray:
|
||||
// NOTE: this is necessarily an array - otherwise, the parent would be *Schema
|
||||
idx, err := strconv.Atoi(entry)
|
||||
if err != nil {
|
||||
return fmt.Errorf("%s not a number: %w", key[1:], err)
|
||||
}
|
||||
container.Schemas[idx] = spec.Schema{SchemaProps: spec.SchemaProps{Ref: ref}}
|
||||
|
||||
case spec.SchemaProperties:
|
||||
container[entry] = spec.Schema{SchemaProps: spec.SchemaProps{Ref: ref}}
|
||||
|
||||
// NOTE: can't have case *spec.SchemaOrBool: the parent in this case is *Schema
|
||||
|
||||
default:
|
||||
return fmt.Errorf("unhandled parent schema rewrite %s (%T)", key, pvalue)
|
||||
}
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
// getPointerFromKey retrieves the content of the JSON pointer "key"
|
||||
func getPointerFromKey(sp interface{}, key string) (string, interface{}, error) {
|
||||
switch sp.(type) {
|
||||
case *spec.Schema:
|
||||
case *spec.Swagger:
|
||||
default:
|
||||
panic("unexpected type used in getPointerFromKey")
|
||||
}
|
||||
if key == "#/" {
|
||||
return "", sp, nil
|
||||
}
|
||||
// unescape chars in key, e.g. "{}" from path params
|
||||
pth, _ := url.PathUnescape(key[1:])
|
||||
ptr, err := jsonpointer.New(pth)
|
||||
if err != nil {
|
||||
return "", nil, err
|
||||
}
|
||||
|
||||
value, _, err := ptr.Get(sp)
|
||||
if err != nil {
|
||||
debugLog("error when getting key: %s with path: %s", key, pth)
|
||||
|
||||
return "", nil, err
|
||||
}
|
||||
|
||||
return pth, value, nil
|
||||
}
|
||||
|
||||
// getParentFromKey retrieves the container of the JSON pointer "key"
|
||||
func getParentFromKey(sp interface{}, key string) (string, string, interface{}, error) {
|
||||
switch sp.(type) {
|
||||
case *spec.Schema:
|
||||
case *spec.Swagger:
|
||||
default:
|
||||
panic("unexpected type used in getParentFromKey")
|
||||
}
|
||||
// unescape chars in key, e.g. "{}" from path params
|
||||
pth, _ := url.PathUnescape(key[1:])
|
||||
|
||||
parent, entry := path.Dir(pth), path.Base(pth)
|
||||
debugLog("getting schema holder at: %s, with entry: %s", parent, entry)
|
||||
|
||||
pptr, err := jsonpointer.New(parent)
|
||||
if err != nil {
|
||||
return "", "", nil, err
|
||||
}
|
||||
pvalue, _, err := pptr.Get(sp)
|
||||
if err != nil {
|
||||
return "", "", nil, fmt.Errorf("can't get parent for %s: %w", parent, err)
|
||||
}
|
||||
|
||||
return parent, entry, pvalue, nil
|
||||
}
|
||||
|
||||
// UpdateRef replaces a ref by another one
|
||||
func UpdateRef(sp interface{}, key string, ref spec.Ref) error {
|
||||
switch sp.(type) {
|
||||
case *spec.Schema:
|
||||
case *spec.Swagger:
|
||||
default:
|
||||
panic("unexpected type used in UpdateRef")
|
||||
}
|
||||
debugLog("updating ref for %s with %s", key, ref.String())
|
||||
pth, value, err := getPointerFromKey(sp, key)
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
|
||||
switch refable := value.(type) {
|
||||
case *spec.Schema:
|
||||
refable.Ref = ref
|
||||
case *spec.SchemaOrArray:
|
||||
if refable.Schema != nil {
|
||||
refable.Schema.Ref = ref
|
||||
}
|
||||
case *spec.SchemaOrBool:
|
||||
if refable.Schema != nil {
|
||||
refable.Schema.Ref = ref
|
||||
}
|
||||
case spec.Schema:
|
||||
debugLog("rewriting holder for %T", refable)
|
||||
_, entry, pvalue, erp := getParentFromKey(sp, key)
|
||||
if erp != nil {
|
||||
return erp
|
||||
}
|
||||
switch container := pvalue.(type) {
|
||||
case spec.Definitions:
|
||||
container[entry] = spec.Schema{SchemaProps: spec.SchemaProps{Ref: ref}}
|
||||
|
||||
case map[string]spec.Schema:
|
||||
container[entry] = spec.Schema{SchemaProps: spec.SchemaProps{Ref: ref}}
|
||||
|
||||
case []spec.Schema:
|
||||
idx, err := strconv.Atoi(entry)
|
||||
if err != nil {
|
||||
return fmt.Errorf("%s not a number: %w", pth, err)
|
||||
}
|
||||
container[idx] = spec.Schema{SchemaProps: spec.SchemaProps{Ref: ref}}
|
||||
|
||||
case *spec.SchemaOrArray:
|
||||
// NOTE: this is necessarily an array - otherwise, the parent would be *Schema
|
||||
idx, err := strconv.Atoi(entry)
|
||||
if err != nil {
|
||||
return fmt.Errorf("%s not a number: %w", pth, err)
|
||||
}
|
||||
container.Schemas[idx] = spec.Schema{SchemaProps: spec.SchemaProps{Ref: ref}}
|
||||
|
||||
case spec.SchemaProperties:
|
||||
container[entry] = spec.Schema{SchemaProps: spec.SchemaProps{Ref: ref}}
|
||||
|
||||
// NOTE: can't have case *spec.SchemaOrBool - the parent in this case is *Schema
|
||||
|
||||
default:
|
||||
return fmt.Errorf("unhandled container type at %s: %T", key, value)
|
||||
}
|
||||
|
||||
default:
|
||||
return fmt.Errorf("no schema with ref found at %s for %T", key, value)
|
||||
}
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
// UpdateRefWithSchema replaces a ref with a schema (i.e. re-inline schema)
|
||||
func UpdateRefWithSchema(sp *spec.Swagger, key string, sch *spec.Schema) error {
|
||||
debugLog("updating ref for %s with schema", key)
|
||||
pth, value, err := getPointerFromKey(sp, key)
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
|
||||
switch refable := value.(type) {
|
||||
case *spec.Schema:
|
||||
*refable = *sch
|
||||
case spec.Schema:
|
||||
_, entry, pvalue, erp := getParentFromKey(sp, key)
|
||||
if erp != nil {
|
||||
return erp
|
||||
}
|
||||
switch container := pvalue.(type) {
|
||||
case spec.Definitions:
|
||||
container[entry] = *sch
|
||||
|
||||
case map[string]spec.Schema:
|
||||
container[entry] = *sch
|
||||
|
||||
case []spec.Schema:
|
||||
idx, err := strconv.Atoi(entry)
|
||||
if err != nil {
|
||||
return fmt.Errorf("%s not a number: %w", pth, err)
|
||||
}
|
||||
container[idx] = *sch
|
||||
|
||||
case *spec.SchemaOrArray:
|
||||
// NOTE: this is necessarily an array - otherwise, the parent would be *Schema
|
||||
idx, err := strconv.Atoi(entry)
|
||||
if err != nil {
|
||||
return fmt.Errorf("%s not a number: %w", pth, err)
|
||||
}
|
||||
container.Schemas[idx] = *sch
|
||||
|
||||
case spec.SchemaProperties:
|
||||
container[entry] = *sch
|
||||
|
||||
// NOTE: can't have case *spec.SchemaOrBool - the parent in this case is *Schema
|
||||
|
||||
default:
|
||||
return fmt.Errorf("unhandled type for parent of [%s]: %T", key, value)
|
||||
}
|
||||
case *spec.SchemaOrArray:
|
||||
*refable.Schema = *sch
|
||||
// NOTE: can't have case *spec.SchemaOrBool - the parent in this case is *Schema
|
||||
case *spec.SchemaOrBool:
|
||||
*refable.Schema = *sch
|
||||
default:
|
||||
return fmt.Errorf("no schema with ref found at %s for %T", key, value)
|
||||
}
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
// DeepestRefResult holds the results from DeepestRef analysis
|
||||
type DeepestRefResult struct {
|
||||
Ref spec.Ref
|
||||
Schema *spec.Schema
|
||||
Warnings []string
|
||||
}
|
||||
|
||||
// DeepestRef finds the first definition ref, from a cascade of nested refs which are not definitions.
|
||||
// - if no definition is found, returns the deepest ref.
|
||||
// - pointers to external files are expanded
|
||||
//
|
||||
// NOTE: all external $ref's are assumed to be already expanded at this stage.
|
||||
func DeepestRef(sp *spec.Swagger, opts *spec.ExpandOptions, ref spec.Ref) (*DeepestRefResult, error) {
|
||||
if !ref.HasFragmentOnly {
|
||||
// we found an external $ref, which is odd at this stage:
|
||||
// do nothing on external $refs
|
||||
return &DeepestRefResult{Ref: ref}, nil
|
||||
}
|
||||
|
||||
currentRef := ref
|
||||
visited := make(map[string]bool, 64)
|
||||
warnings := make([]string, 0, 2)
|
||||
|
||||
DOWNREF:
|
||||
for currentRef.String() != "" {
|
||||
if path.Dir(currentRef.String()) == definitionsPath {
|
||||
// this is a top-level definition: stop here and return this ref
|
||||
return &DeepestRefResult{Ref: currentRef}, nil
|
||||
}
|
||||
|
||||
if _, beenThere := visited[currentRef.String()]; beenThere {
|
||||
return nil,
|
||||
fmt.Errorf("cannot resolve cyclic chain of pointers under %s", currentRef.String())
|
||||
}
|
||||
|
||||
visited[currentRef.String()] = true
|
||||
value, _, err := currentRef.GetPointer().Get(sp)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
switch refable := value.(type) {
|
||||
case *spec.Schema:
|
||||
if refable.Ref.String() == "" {
|
||||
break DOWNREF
|
||||
}
|
||||
currentRef = refable.Ref
|
||||
|
||||
case spec.Schema:
|
||||
if refable.Ref.String() == "" {
|
||||
break DOWNREF
|
||||
}
|
||||
currentRef = refable.Ref
|
||||
|
||||
case *spec.SchemaOrArray:
|
||||
if refable.Schema == nil || refable.Schema.Ref.String() == "" {
|
||||
break DOWNREF
|
||||
}
|
||||
currentRef = refable.Schema.Ref
|
||||
|
||||
case *spec.SchemaOrBool:
|
||||
if refable.Schema == nil || refable.Schema.Ref.String() == "" {
|
||||
break DOWNREF
|
||||
}
|
||||
currentRef = refable.Schema.Ref
|
||||
|
||||
case spec.Response:
|
||||
// a pointer points to a schema initially marshalled in responses section...
|
||||
// Attempt to convert this to a schema. If this fails, the spec is invalid
|
||||
asJSON, _ := refable.MarshalJSON()
|
||||
var asSchema spec.Schema
|
||||
|
||||
err := asSchema.UnmarshalJSON(asJSON)
|
||||
if err != nil {
|
||||
return nil,
|
||||
fmt.Errorf("invalid type for resolved JSON pointer %s. Expected a schema a, got: %T",
|
||||
currentRef.String(), value)
|
||||
}
|
||||
warnings = append(warnings, fmt.Sprintf("found $ref %q (response) interpreted as schema", currentRef.String()))
|
||||
|
||||
if asSchema.Ref.String() == "" {
|
||||
break DOWNREF
|
||||
}
|
||||
currentRef = asSchema.Ref
|
||||
|
||||
case spec.Parameter:
|
||||
// a pointer points to a schema initially marshalled in parameters section...
|
||||
// Attempt to convert this to a schema. If this fails, the spec is invalid
|
||||
asJSON, _ := refable.MarshalJSON()
|
||||
var asSchema spec.Schema
|
||||
if err := asSchema.UnmarshalJSON(asJSON); err != nil {
|
||||
return nil,
|
||||
fmt.Errorf("invalid type for resolved JSON pointer %s. Expected a schema a, got: %T",
|
||||
currentRef.String(), value)
|
||||
}
|
||||
|
||||
warnings = append(warnings, fmt.Sprintf("found $ref %q (parameter) interpreted as schema", currentRef.String()))
|
||||
|
||||
if asSchema.Ref.String() == "" {
|
||||
break DOWNREF
|
||||
}
|
||||
currentRef = asSchema.Ref
|
||||
|
||||
default:
|
||||
return nil,
|
||||
fmt.Errorf("unhandled type to resolve JSON pointer %s. Expected a Schema, got: %T",
|
||||
currentRef.String(), value)
|
||||
}
|
||||
}
|
||||
|
||||
// assess what schema we're ending with
|
||||
sch, erv := spec.ResolveRefWithBase(sp, ¤tRef, opts)
|
||||
if erv != nil {
|
||||
return nil, erv
|
||||
}
|
||||
|
||||
if sch == nil {
|
||||
return nil, fmt.Errorf("no schema found at %s", currentRef.String())
|
||||
}
|
||||
|
||||
return &DeepestRefResult{Ref: currentRef, Schema: sch, Warnings: warnings}, nil
|
||||
}
|
||||
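The ref-rewriting helpers above all follow one pattern: resolve a JSON pointer against the parsed spec, switch on the concrete Go type the pointer lands on, and swap the `$ref` inside its container. Below is a minimal, hedged sketch of that pattern using only the public go-openapi/spec and go-openapi/jsonpointer APIs (the helpers above live in an internal package); the tiny spec literal and the pointer path are invented for illustration.

```go
package main

import (
	"fmt"

	"github.com/go-openapi/jsonpointer"
	"github.com/go-openapi/spec"
)

func main() {
	// A tiny in-memory spec: "Pet" is just a $ref to "Animal".
	sp := &spec.Swagger{SwaggerProps: spec.SwaggerProps{
		Definitions: spec.Definitions{
			"Pet":    *spec.RefSchema("#/definitions/Animal"),
			"Animal": *spec.StringProperty(),
		},
	}}

	// Resolve the JSON pointer to the schema we want to rewrite
	// (same mechanism as getPointerFromKey above).
	ptr, err := jsonpointer.New("/definitions/Pet")
	if err != nil {
		panic(err)
	}
	value, _, err := ptr.Get(sp)
	if err != nil {
		panic(err)
	}

	// The container here is spec.Definitions, so rewrite the map entry,
	// mirroring the "case spec.Definitions" branch of UpdateRef.
	if sch, ok := value.(spec.Schema); ok && sch.Ref.String() != "" {
		sp.Definitions["Pet"] = spec.Schema{
			SchemaProps: spec.SchemaProps{Ref: spec.MustCreateRef("#/definitions/NewAnimal")},
		}
	}

	fmt.Println(sp.Definitions["Pet"].Ref.String()) // #/definitions/NewAnimal
}
```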
29
vendor/github.com/go-openapi/analysis/internal/flatten/schutils/flatten_schema.go
generated
vendored
Normal file
|
|
@ -0,0 +1,29 @@
|
|||
// Package schutils provides tools to save or clone a schema
|
||||
// when flattening a spec.
|
||||
package schutils
|
||||
|
||||
import (
|
||||
"github.com/go-openapi/spec"
|
||||
"github.com/go-openapi/swag"
|
||||
)
|
||||
|
||||
// Save registers a schema as an entry in spec #/definitions
|
||||
func Save(sp *spec.Swagger, name string, schema *spec.Schema) {
|
||||
if schema == nil {
|
||||
return
|
||||
}
|
||||
|
||||
if sp.Definitions == nil {
|
||||
sp.Definitions = make(map[string]spec.Schema, 150)
|
||||
}
|
||||
|
||||
sp.Definitions[name] = *schema
|
||||
}
|
||||
|
||||
// Clone deep-clones a schema
|
||||
func Clone(schema *spec.Schema) *spec.Schema {
|
||||
var sch spec.Schema
|
||||
_ = swag.FromDynamicJSON(schema, &sch)
|
||||
|
||||
return &sch
|
||||
}
|
||||
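Clone above deep-copies a schema by round-tripping it through its dynamic JSON form with swag.FromDynamicJSON, so the copy shares no pointers with the original. A minimal sketch of the same trick with the public packages (field values are illustrative):

```go
package main

import (
	"fmt"

	"github.com/go-openapi/spec"
	"github.com/go-openapi/swag"
)

func main() {
	original := spec.StringProperty()
	original.Title = "name"

	// Deep-clone by serializing to a dynamic JSON document and back,
	// exactly as Clone does above.
	var clone spec.Schema
	if err := swag.FromDynamicJSON(original, &clone); err != nil {
		panic(err)
	}

	clone.Title = "changed"
	fmt.Println(original.Title, clone.Title) // name changed
}
```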
201
vendor/github.com/go-openapi/analysis/internal/flatten/sortref/keys.go
generated
vendored
Normal file
|
|
@ -0,0 +1,201 @@
|
|||
package sortref
|
||||
|
||||
import (
|
||||
"net/http"
|
||||
"path"
|
||||
"strconv"
|
||||
"strings"
|
||||
|
||||
"github.com/go-openapi/jsonpointer"
|
||||
"github.com/go-openapi/spec"
|
||||
)
|
||||
|
||||
const (
|
||||
paths = "paths"
|
||||
responses = "responses"
|
||||
parameters = "parameters"
|
||||
definitions = "definitions"
|
||||
)
|
||||
|
||||
var (
|
||||
ignoredKeys map[string]struct{}
|
||||
validMethods map[string]struct{}
|
||||
)
|
||||
|
||||
func init() {
|
||||
ignoredKeys = map[string]struct{}{
|
||||
"schema": {},
|
||||
"properties": {},
|
||||
"not": {},
|
||||
"anyOf": {},
|
||||
"oneOf": {},
|
||||
}
|
||||
|
||||
validMethods = map[string]struct{}{
|
||||
"GET": {},
|
||||
"HEAD": {},
|
||||
"OPTIONS": {},
|
||||
"PATCH": {},
|
||||
"POST": {},
|
||||
"PUT": {},
|
||||
"DELETE": {},
|
||||
}
|
||||
}
|
||||
|
||||
// Key represents a key item constructed from /-separated segments
|
||||
type Key struct {
|
||||
Segments int
|
||||
Key string
|
||||
}
|
||||
|
||||
// Keys is a sortable collection of Keys
|
||||
type Keys []Key
|
||||
|
||||
func (k Keys) Len() int { return len(k) }
|
||||
func (k Keys) Swap(i, j int) { k[i], k[j] = k[j], k[i] }
|
||||
func (k Keys) Less(i, j int) bool {
|
||||
return k[i].Segments > k[j].Segments || (k[i].Segments == k[j].Segments && k[i].Key < k[j].Key)
|
||||
}
|
||||
|
||||
// KeyParts constructs a SplitKey with all its /-separated segments decomposed. It is sortable.
|
||||
func KeyParts(key string) SplitKey {
|
||||
var res []string
|
||||
for _, part := range strings.Split(key[1:], "/") {
|
||||
if part != "" {
|
||||
res = append(res, jsonpointer.Unescape(part))
|
||||
}
|
||||
}
|
||||
|
||||
return res
|
||||
}
|
||||
|
||||
// SplitKey holds the parts of a /-separated key, so that their location may be determined.
|
||||
type SplitKey []string
|
||||
|
||||
// IsDefinition is true when the split key is in the #/definitions section of a spec
|
||||
func (s SplitKey) IsDefinition() bool {
|
||||
return len(s) > 1 && s[0] == definitions
|
||||
}
|
||||
|
||||
// DefinitionName yields the name of the definition
|
||||
func (s SplitKey) DefinitionName() string {
|
||||
if !s.IsDefinition() {
|
||||
return ""
|
||||
}
|
||||
|
||||
return s[1]
|
||||
}
|
||||
|
||||
func (s SplitKey) isKeyName(i int) bool {
|
||||
if i <= 0 {
|
||||
return false
|
||||
}
|
||||
|
||||
count := 0
|
||||
for idx := i - 1; idx > 0; idx-- {
|
||||
if s[idx] != "properties" {
|
||||
break
|
||||
}
|
||||
count++
|
||||
}
|
||||
|
||||
return count%2 != 0
|
||||
}
|
||||
|
||||
// PartAdder knows how to construct the components of a new name
|
||||
type PartAdder func(string) []string
|
||||
|
||||
// BuildName builds a name from segments
|
||||
func (s SplitKey) BuildName(segments []string, startIndex int, adder PartAdder) string {
|
||||
for i, part := range s[startIndex:] {
|
||||
if _, ignored := ignoredKeys[part]; !ignored || s.isKeyName(startIndex+i) {
|
||||
segments = append(segments, adder(part)...)
|
||||
}
|
||||
}
|
||||
|
||||
return strings.Join(segments, " ")
|
||||
}
|
||||
|
||||
// IsOperation is true when the split key is in the operations section
|
||||
func (s SplitKey) IsOperation() bool {
|
||||
return len(s) > 1 && s[0] == paths
|
||||
}
|
||||
|
||||
// IsSharedOperationParam is true when the split key is in the parameters section of a path
|
||||
func (s SplitKey) IsSharedOperationParam() bool {
|
||||
return len(s) > 2 && s[0] == paths && s[2] == parameters
|
||||
}
|
||||
|
||||
// IsSharedParam is true when the split key is in the #/parameters section of a spec
|
||||
func (s SplitKey) IsSharedParam() bool {
|
||||
return len(s) > 1 && s[0] == parameters
|
||||
}
|
||||
|
||||
// IsOperationParam is true when the split key is in the parameters section of an operation
|
||||
func (s SplitKey) IsOperationParam() bool {
|
||||
return len(s) > 3 && s[0] == paths && s[3] == parameters
|
||||
}
|
||||
|
||||
// IsOperationResponse is true when the split key is in the responses section of an operation
|
||||
func (s SplitKey) IsOperationResponse() bool {
|
||||
return len(s) > 3 && s[0] == paths && s[3] == responses
|
||||
}
|
||||
|
||||
// IsSharedResponse is true when the split key is in the #/responses section of a spec
|
||||
func (s SplitKey) IsSharedResponse() bool {
|
||||
return len(s) > 1 && s[0] == responses
|
||||
}
|
||||
|
||||
// IsDefaultResponse is true when the split key is the default response for an operation
|
||||
func (s SplitKey) IsDefaultResponse() bool {
|
||||
return len(s) > 4 && s[0] == paths && s[3] == responses && s[4] == "default"
|
||||
}
|
||||
|
||||
// IsStatusCodeResponse is true when the split key is an operation response with a status code
|
||||
func (s SplitKey) IsStatusCodeResponse() bool {
|
||||
isInt := func() bool {
|
||||
_, err := strconv.Atoi(s[4])
|
||||
|
||||
return err == nil
|
||||
}
|
||||
|
||||
return len(s) > 4 && s[0] == paths && s[3] == responses && isInt()
|
||||
}
|
||||
|
||||
// ResponseName yields either the status code or "Default" for a response
|
||||
func (s SplitKey) ResponseName() string {
|
||||
if s.IsStatusCodeResponse() {
|
||||
code, _ := strconv.Atoi(s[4])
|
||||
|
||||
return http.StatusText(code)
|
||||
}
|
||||
|
||||
if s.IsDefaultResponse() {
|
||||
return "Default"
|
||||
}
|
||||
|
||||
return ""
|
||||
}
|
||||
|
||||
// PathItemRef constructs a $ref object from a split key of the form /{path}/{method}
|
||||
func (s SplitKey) PathItemRef() spec.Ref {
|
||||
if len(s) < 3 {
|
||||
return spec.Ref{}
|
||||
}
|
||||
|
||||
pth, method := s[1], s[2]
|
||||
if _, isValidMethod := validMethods[strings.ToUpper(method)]; !isValidMethod && !strings.HasPrefix(method, "x-") {
|
||||
return spec.Ref{}
|
||||
}
|
||||
|
||||
return spec.MustCreateRef("#" + path.Join("/", paths, jsonpointer.Escape(pth), strings.ToUpper(method)))
|
||||
}
|
||||
|
||||
// PathRef constructs a $ref object from a split key of the form /paths/{reference}
|
||||
func (s SplitKey) PathRef() spec.Ref {
|
||||
if !s.IsOperation() {
|
||||
return spec.Ref{}
|
||||
}
|
||||
|
||||
return spec.MustCreateRef("#" + path.Join("/", paths, jsonpointer.Escape(s[1])))
|
||||
}
|
||||
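The SplitKey helpers above classify flattened analysis keys purely by their position in the pointer: segment 0 says which top-level section the key lives in, and later segments distinguish shared params, operation params, responses, and so on. A small self-contained sketch of that classification idea follows; it re-implements the logic rather than importing the internal sortref package, and the sample key is invented.

```go
package main

import (
	"fmt"
	"strings"

	"github.com/go-openapi/jsonpointer"
)

// keyParts mirrors KeyParts above: decompose a #/-prefixed key into its
// unescaped /-separated segments.
func keyParts(key string) []string {
	var res []string
	for _, part := range strings.Split(key[1:], "/") {
		if part != "" {
			res = append(res, jsonpointer.Unescape(part))
		}
	}

	return res
}

func main() {
	s := keyParts("#/paths/~1statuses/post/parameters/0")

	// paths/<path>/<method>/parameters/... marks an operation parameter,
	// the same test as SplitKey.IsOperationParam above.
	isOpParam := len(s) > 3 && s[0] == "paths" && s[3] == "parameters"

	fmt.Println(s, isOpParam) // [paths /statuses post parameters 0] true
}
```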
141
vendor/github.com/go-openapi/analysis/internal/flatten/sortref/sort_ref.go
generated
vendored
Normal file
|
|
@ -0,0 +1,141 @@
|
|||
package sortref
|
||||
|
||||
import (
|
||||
"reflect"
|
||||
"sort"
|
||||
"strings"
|
||||
|
||||
"github.com/go-openapi/analysis/internal/flatten/normalize"
|
||||
"github.com/go-openapi/spec"
|
||||
)
|
||||
|
||||
var depthGroupOrder = []string{
|
||||
"sharedParam", "sharedResponse", "sharedOpParam", "opParam", "codeResponse", "defaultResponse", "definition",
|
||||
}
|
||||
|
||||
type mapIterator struct {
|
||||
len int
|
||||
mapIter *reflect.MapIter
|
||||
}
|
||||
|
||||
func (i *mapIterator) Next() bool {
|
||||
return i.mapIter.Next()
|
||||
}
|
||||
|
||||
func (i *mapIterator) Len() int {
|
||||
return i.len
|
||||
}
|
||||
|
||||
func (i *mapIterator) Key() string {
|
||||
return i.mapIter.Key().String()
|
||||
}
|
||||
|
||||
func mustMapIterator(anyMap interface{}) *mapIterator {
|
||||
val := reflect.ValueOf(anyMap)
|
||||
|
||||
return &mapIterator{mapIter: val.MapRange(), len: val.Len()}
|
||||
}
|
||||
|
||||
// DepthFirst sorts a map of anything. It groups keys by category
|
||||
// (shared params, op param, status code response, default response, definitions),
|
||||
// sorts groups internally by number of parts in the key and lexical names,
|
||||
// and flattens the groups into a single list of keys.
|
||||
func DepthFirst(in interface{}) []string {
|
||||
iterator := mustMapIterator(in)
|
||||
sorted := make([]string, 0, iterator.Len())
|
||||
grouped := make(map[string]Keys, iterator.Len())
|
||||
|
||||
for iterator.Next() {
|
||||
k := iterator.Key()
|
||||
split := KeyParts(k)
|
||||
var pk string
|
||||
|
||||
if split.IsSharedOperationParam() {
|
||||
pk = "sharedOpParam"
|
||||
}
|
||||
if split.IsOperationParam() {
|
||||
pk = "opParam"
|
||||
}
|
||||
if split.IsStatusCodeResponse() {
|
||||
pk = "codeResponse"
|
||||
}
|
||||
if split.IsDefaultResponse() {
|
||||
pk = "defaultResponse"
|
||||
}
|
||||
if split.IsDefinition() {
|
||||
pk = "definition"
|
||||
}
|
||||
if split.IsSharedParam() {
|
||||
pk = "sharedParam"
|
||||
}
|
||||
if split.IsSharedResponse() {
|
||||
pk = "sharedResponse"
|
||||
}
|
||||
grouped[pk] = append(grouped[pk], Key{Segments: len(split), Key: k})
|
||||
}
|
||||
|
||||
for _, pk := range depthGroupOrder {
|
||||
res := grouped[pk]
|
||||
sort.Sort(res)
|
||||
|
||||
for _, v := range res {
|
||||
sorted = append(sorted, v.Key)
|
||||
}
|
||||
}
|
||||
|
||||
return sorted
|
||||
}
|
||||
|
||||
// topmostRefs is able to sort refs by hierarchical then lexicographic order,
|
||||
// yielding refs ordered breadth-first.
|
||||
type topmostRefs []string
|
||||
|
||||
func (k topmostRefs) Len() int { return len(k) }
|
||||
func (k topmostRefs) Swap(i, j int) { k[i], k[j] = k[j], k[i] }
|
||||
func (k topmostRefs) Less(i, j int) bool {
|
||||
li, lj := len(strings.Split(k[i], "/")), len(strings.Split(k[j], "/"))
|
||||
if li == lj {
|
||||
return k[i] < k[j]
|
||||
}
|
||||
|
||||
return li < lj
|
||||
}
|
||||
|
||||
// TopmostFirst sorts references by depth
|
||||
func TopmostFirst(refs []string) []string {
|
||||
res := topmostRefs(refs)
|
||||
sort.Sort(res)
|
||||
|
||||
return res
|
||||
}
|
||||
|
||||
// RefRevIdx is a reverse index for references
|
||||
type RefRevIdx struct {
|
||||
Ref spec.Ref
|
||||
Keys []string
|
||||
}
|
||||
|
||||
// ReverseIndex builds a reverse index for references in schemas
|
||||
func ReverseIndex(schemas map[string]spec.Ref, basePath string) map[string]RefRevIdx {
|
||||
collected := make(map[string]RefRevIdx)
|
||||
for key, schRef := range schemas {
|
||||
// normalize paths before sorting,
|
||||
// so we get together keys that are from the same external file
|
||||
normalizedPath := normalize.Path(schRef, basePath)
|
||||
|
||||
entry, ok := collected[normalizedPath]
|
||||
if ok {
|
||||
entry.Keys = append(entry.Keys, key)
|
||||
collected[normalizedPath] = entry
|
||||
|
||||
continue
|
||||
}
|
||||
|
||||
collected[normalizedPath] = RefRevIdx{
|
||||
Ref: schRef,
|
||||
Keys: []string{key},
|
||||
}
|
||||
}
|
||||
|
||||
return collected
|
||||
}
|
||||
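TopmostFirst above orders refs breadth-first: shallower pointers (fewer path segments) come first, and ties are broken lexicographically. A short sketch of the same ordering, re-implemented with sort.Slice so it does not depend on the internal sortref package (the ref strings are invented):

```go
package main

import (
	"fmt"
	"sort"
	"strings"
)

// byDepthThenName mirrors topmostRefs above: fewer path segments first,
// then lexicographic order within the same depth.
func byDepthThenName(refs []string) []string {
	sort.Slice(refs, func(i, j int) bool {
		li, lj := len(strings.Split(refs[i], "/")), len(strings.Split(refs[j], "/"))
		if li == lj {
			return refs[i] < refs[j]
		}

		return li < lj
	})

	return refs
}

func main() {
	refs := []string{
		"#/definitions/a/properties/b",
		"#/definitions/z",
		"#/definitions/a",
	}
	fmt.Println(byDepthThenName(refs))
	// [#/definitions/a #/definitions/z #/definitions/a/properties/b]
}
```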
515
vendor/github.com/go-openapi/analysis/mixin.go
generated
vendored
Normal file
|
|
@ -0,0 +1,515 @@
|
|||
// Copyright 2015 go-swagger maintainers
|
||||
//
|
||||
// Licensed under the Apache License, Version 2.0 (the "License");
|
||||
// you may not use this file except in compliance with the License.
|
||||
// You may obtain a copy of the License at
|
||||
//
|
||||
// http://www.apache.org/licenses/LICENSE-2.0
|
||||
//
|
||||
// Unless required by applicable law or agreed to in writing, software
|
||||
// distributed under the License is distributed on an "AS IS" BASIS,
|
||||
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
// See the License for the specific language governing permissions and
|
||||
// limitations under the License.
|
||||
|
||||
package analysis
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
"reflect"
|
||||
|
||||
"github.com/go-openapi/spec"
|
||||
)
|
||||
|
||||
// Mixin modifies the primary swagger spec by adding the paths and
|
||||
// definitions from the mixin specs. Top level parameters and
|
||||
// responses from the mixins are also carried over. Operation id
|
||||
// collisions are avoided by appending "Mixin<N>" but only if
|
||||
// needed.
|
||||
//
|
||||
// The following parts of primary are subject to merge, filling empty details
|
||||
// - Info
|
||||
// - BasePath
|
||||
// - Host
|
||||
// - ExternalDocs
|
||||
//
|
||||
// Consider calling FixEmptyResponseDescriptions() on the modified primary
|
||||
// if you read them from storage and they are valid to start with.
|
||||
//
|
||||
// Entries in "paths", "definitions", "parameters" and "responses" are
|
||||
// added to the primary in the order of the given mixins. If the entry
|
||||
// already exists in primary it is skipped with a warning message.
|
||||
//
|
||||
// The count of skipped entries (from collisions) is returned so any
|
||||
// deviation from the number expected can flag a warning in your build
|
||||
// scripts. Carefully review the collisions before accepting them;
|
||||
// consider renaming things if possible.
|
||||
//
|
||||
// No key normalization takes place (paths, type defs,
|
||||
// etc). Ensure they are canonical if your downstream tools do
|
||||
// key normalization of any form.
|
||||
//
|
||||
// Merging schemes (http, https), and consumers/producers do not account for
|
||||
// collisions.
|
||||
func Mixin(primary *spec.Swagger, mixins ...*spec.Swagger) []string {
|
||||
skipped := make([]string, 0, len(mixins))
|
||||
opIds := getOpIds(primary)
|
||||
initPrimary(primary)
|
||||
|
||||
for i, m := range mixins {
|
||||
skipped = append(skipped, mergeSwaggerProps(primary, m)...)
|
||||
|
||||
skipped = append(skipped, mergeConsumes(primary, m)...)
|
||||
|
||||
skipped = append(skipped, mergeProduces(primary, m)...)
|
||||
|
||||
skipped = append(skipped, mergeTags(primary, m)...)
|
||||
|
||||
skipped = append(skipped, mergeSchemes(primary, m)...)
|
||||
|
||||
skipped = append(skipped, mergeSecurityDefinitions(primary, m)...)
|
||||
|
||||
skipped = append(skipped, mergeSecurityRequirements(primary, m)...)
|
||||
|
||||
skipped = append(skipped, mergeDefinitions(primary, m)...)
|
||||
|
||||
// merging paths requires a map of operationIDs to work with
|
||||
skipped = append(skipped, mergePaths(primary, m, opIds, i)...)
|
||||
|
||||
skipped = append(skipped, mergeParameters(primary, m)...)
|
||||
|
||||
skipped = append(skipped, mergeResponses(primary, m)...)
|
||||
}
|
||||
|
||||
return skipped
|
||||
}
|
||||
|
||||
// getOpIds extracts all the paths.<path>.operationIds from the given
|
||||
// spec and returns them as the keys in a map with 'true' values.
|
||||
func getOpIds(s *spec.Swagger) map[string]bool {
|
||||
rv := make(map[string]bool)
|
||||
if s.Paths == nil {
|
||||
return rv
|
||||
}
|
||||
|
||||
for _, v := range s.Paths.Paths {
|
||||
piops := pathItemOps(v)
|
||||
|
||||
for _, op := range piops {
|
||||
rv[op.ID] = true
|
||||
}
|
||||
}
|
||||
|
||||
return rv
|
||||
}
|
||||
|
||||
func pathItemOps(p spec.PathItem) []*spec.Operation {
|
||||
var rv []*spec.Operation
|
||||
rv = appendOp(rv, p.Get)
|
||||
rv = appendOp(rv, p.Put)
|
||||
rv = appendOp(rv, p.Post)
|
||||
rv = appendOp(rv, p.Delete)
|
||||
rv = appendOp(rv, p.Head)
|
||||
rv = appendOp(rv, p.Patch)
|
||||
|
||||
return rv
|
||||
}
|
||||
|
||||
func appendOp(ops []*spec.Operation, op *spec.Operation) []*spec.Operation {
|
||||
if op == nil {
|
||||
return ops
|
||||
}
|
||||
|
||||
return append(ops, op)
|
||||
}
|
||||
|
||||
func mergeSecurityDefinitions(primary *spec.Swagger, m *spec.Swagger) (skipped []string) {
|
||||
for k, v := range m.SecurityDefinitions {
|
||||
if _, exists := primary.SecurityDefinitions[k]; exists {
|
||||
warn := fmt.Sprintf(
|
||||
"SecurityDefinitions entry '%v' already exists in primary or higher priority mixin, skipping\n", k)
|
||||
skipped = append(skipped, warn)
|
||||
|
||||
continue
|
||||
}
|
||||
|
||||
primary.SecurityDefinitions[k] = v
|
||||
}
|
||||
|
||||
return
|
||||
}
|
||||
|
||||
func mergeSecurityRequirements(primary *spec.Swagger, m *spec.Swagger) (skipped []string) {
|
||||
for _, v := range m.Security {
|
||||
found := false
|
||||
for _, vv := range primary.Security {
|
||||
if reflect.DeepEqual(v, vv) {
|
||||
found = true
|
||||
|
||||
break
|
||||
}
|
||||
}
|
||||
|
||||
if found {
|
||||
warn := fmt.Sprintf(
|
||||
"Security requirement: '%v' already exists in primary or higher priority mixin, skipping\n", v)
|
||||
skipped = append(skipped, warn)
|
||||
|
||||
continue
|
||||
}
|
||||
primary.Security = append(primary.Security, v)
|
||||
}
|
||||
|
||||
return
|
||||
}
|
||||
|
||||
func mergeDefinitions(primary *spec.Swagger, m *spec.Swagger) (skipped []string) {
|
||||
for k, v := range m.Definitions {
|
||||
// assume name collisions represent IDENTICAL type. careful.
|
||||
if _, exists := primary.Definitions[k]; exists {
|
||||
warn := fmt.Sprintf(
|
||||
"definitions entry '%v' already exists in primary or higher priority mixin, skipping\n", k)
|
||||
skipped = append(skipped, warn)
|
||||
|
||||
continue
|
||||
}
|
||||
primary.Definitions[k] = v
|
||||
}
|
||||
|
||||
return
|
||||
}
|
||||
|
||||
func mergePaths(primary *spec.Swagger, m *spec.Swagger, opIds map[string]bool, mixIndex int) (skipped []string) {
|
||||
if m.Paths != nil {
|
||||
for k, v := range m.Paths.Paths {
|
||||
if _, exists := primary.Paths.Paths[k]; exists {
|
||||
warn := fmt.Sprintf(
|
||||
"paths entry '%v' already exists in primary or higher priority mixin, skipping\n", k)
|
||||
skipped = append(skipped, warn)
|
||||
|
||||
continue
|
||||
}
|
||||
|
||||
// Swagger requires that operationIds be
|
||||
// unique within a spec. If we find a
|
||||
// collision we append "Mixin0" to the
|
||||
// operationId we are adding, where 0 is the mixin
|
||||
// index. We assume that operationIds within
|
||||
// all the provided specs are already unique.
|
||||
piops := pathItemOps(v)
|
||||
for _, piop := range piops {
|
||||
if opIds[piop.ID] {
|
||||
piop.ID = fmt.Sprintf("%v%v%v", piop.ID, "Mixin", mixIndex)
|
||||
}
|
||||
opIds[piop.ID] = true
|
||||
}
|
||||
primary.Paths.Paths[k] = v
|
||||
}
|
||||
}
|
||||
|
||||
return
|
||||
}
|
||||
|
||||
func mergeParameters(primary *spec.Swagger, m *spec.Swagger) (skipped []string) {
|
||||
for k, v := range m.Parameters {
|
||||
// could try to rename on conflict but would
|
||||
// have to fix $refs in the mixin. Complain
|
||||
// for now
|
||||
if _, exists := primary.Parameters[k]; exists {
|
||||
warn := fmt.Sprintf(
|
||||
"top level parameters entry '%v' already exists in primary or higher priority mixin, skipping\n", k)
|
||||
skipped = append(skipped, warn)
|
||||
|
||||
continue
|
||||
}
|
||||
primary.Parameters[k] = v
|
||||
}
|
||||
|
||||
return
|
||||
}
|
||||
|
||||
func mergeResponses(primary *spec.Swagger, m *spec.Swagger) (skipped []string) {
|
||||
for k, v := range m.Responses {
|
||||
// could try to rename on conflict but would
|
||||
// have to fix $refs in the mixin. Complain
|
||||
// for now
|
||||
if _, exists := primary.Responses[k]; exists {
|
||||
warn := fmt.Sprintf(
|
||||
"top level responses entry '%v' already exists in primary or higher priority mixin, skipping\n", k)
|
||||
skipped = append(skipped, warn)
|
||||
|
||||
continue
|
||||
}
|
||||
primary.Responses[k] = v
|
||||
}
|
||||
|
||||
return skipped
|
||||
}
|
||||
|
||||
func mergeConsumes(primary *spec.Swagger, m *spec.Swagger) []string {
|
||||
for _, v := range m.Consumes {
|
||||
found := false
|
||||
for _, vv := range primary.Consumes {
|
||||
if v == vv {
|
||||
found = true
|
||||
|
||||
break
|
||||
}
|
||||
}
|
||||
|
||||
if found {
|
||||
// no warning here: we just skip it
|
||||
continue
|
||||
}
|
||||
primary.Consumes = append(primary.Consumes, v)
|
||||
}
|
||||
|
||||
return []string{}
|
||||
}
|
||||
|
||||
func mergeProduces(primary *spec.Swagger, m *spec.Swagger) []string {
|
||||
for _, v := range m.Produces {
|
||||
found := false
|
||||
for _, vv := range primary.Produces {
|
||||
if v == vv {
|
||||
found = true
|
||||
|
||||
break
|
||||
}
|
||||
}
|
||||
|
||||
if found {
|
||||
// no warning here: we just skip it
|
||||
continue
|
||||
}
|
||||
primary.Produces = append(primary.Produces, v)
|
||||
}
|
||||
|
||||
return []string{}
|
||||
}
|
||||
|
||||
func mergeTags(primary *spec.Swagger, m *spec.Swagger) (skipped []string) {
|
||||
for _, v := range m.Tags {
|
||||
found := false
|
||||
for _, vv := range primary.Tags {
|
||||
if v.Name == vv.Name {
|
||||
found = true
|
||||
|
||||
break
|
||||
}
|
||||
}
|
||||
|
||||
if found {
|
||||
warn := fmt.Sprintf(
|
||||
"top level tags entry with name '%v' already exists in primary or higher priority mixin, skipping\n",
|
||||
v.Name,
|
||||
)
|
||||
skipped = append(skipped, warn)
|
||||
|
||||
continue
|
||||
}
|
||||
|
||||
primary.Tags = append(primary.Tags, v)
|
||||
}
|
||||
|
||||
return
|
||||
}
|
||||
|
||||
func mergeSchemes(primary *spec.Swagger, m *spec.Swagger) []string {
|
||||
for _, v := range m.Schemes {
|
||||
found := false
|
||||
for _, vv := range primary.Schemes {
|
||||
if v == vv {
|
||||
found = true
|
||||
|
||||
break
|
||||
}
|
||||
}
|
||||
|
||||
if found {
|
||||
// no warning here: we just skip it
|
||||
continue
|
||||
}
|
||||
primary.Schemes = append(primary.Schemes, v)
|
||||
}
|
||||
|
||||
return []string{}
|
||||
}
|
||||
|
||||
func mergeSwaggerProps(primary *spec.Swagger, m *spec.Swagger) []string {
|
||||
var skipped, skippedInfo, skippedDocs []string
|
||||
|
||||
primary.Extensions, skipped = mergeExtensions(primary.Extensions, m.Extensions)
|
||||
|
||||
// merging details in swagger top properties
|
||||
if primary.Host == "" {
|
||||
primary.Host = m.Host
|
||||
}
|
||||
|
||||
if primary.BasePath == "" {
|
||||
primary.BasePath = m.BasePath
|
||||
}
|
||||
|
||||
if primary.Info == nil {
|
||||
primary.Info = m.Info
|
||||
} else if m.Info != nil {
|
||||
skippedInfo = mergeInfo(primary.Info, m.Info)
|
||||
skipped = append(skipped, skippedInfo...)
|
||||
}
|
||||
|
||||
if primary.ExternalDocs == nil {
|
||||
primary.ExternalDocs = m.ExternalDocs
|
||||
} else if m.ExternalDocs != nil {
|
||||
skippedDocs = mergeExternalDocs(primary.ExternalDocs, m.ExternalDocs)
|
||||
skipped = append(skipped, skippedDocs...)
|
||||
}
|
||||
|
||||
return skipped
|
||||
}
|
||||
|
||||
// nolint: unparam
|
||||
func mergeExternalDocs(primary *spec.ExternalDocumentation, m *spec.ExternalDocumentation) []string {
|
||||
if primary.Description == "" {
|
||||
primary.Description = m.Description
|
||||
}
|
||||
|
||||
if primary.URL == "" {
|
||||
primary.URL = m.URL
|
||||
}
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
func mergeInfo(primary *spec.Info, m *spec.Info) []string {
|
||||
var sk, skipped []string
|
||||
|
||||
primary.Extensions, sk = mergeExtensions(primary.Extensions, m.Extensions)
|
||||
skipped = append(skipped, sk...)
|
||||
|
||||
if primary.Description == "" {
|
||||
primary.Description = m.Description
|
||||
}
|
||||
|
||||
if primary.Title == "" {
|
||||
primary.Title = m.Title
|
||||
}
|
||||
|
||||
if primary.TermsOfService == "" {
|
||||
primary.TermsOfService = m.TermsOfService
|
||||
}
|
||||
|
||||
if primary.Version == "" {
|
||||
primary.Version = m.Version
|
||||
}
|
||||
|
||||
if primary.Contact == nil {
|
||||
primary.Contact = m.Contact
|
||||
} else if m.Contact != nil {
|
||||
var csk []string
|
||||
primary.Contact.Extensions, csk = mergeExtensions(primary.Contact.Extensions, m.Contact.Extensions)
|
||||
skipped = append(skipped, csk...)
|
||||
|
||||
if primary.Contact.Name == "" {
|
||||
primary.Contact.Name = m.Contact.Name
|
||||
}
|
||||
|
||||
if primary.Contact.URL == "" {
|
||||
primary.Contact.URL = m.Contact.URL
|
||||
}
|
||||
|
||||
if primary.Contact.Email == "" {
|
||||
primary.Contact.Email = m.Contact.Email
|
||||
}
|
||||
}
|
||||
|
||||
if primary.License == nil {
|
||||
primary.License = m.License
|
||||
} else if m.License != nil {
|
||||
var lsk []string
|
||||
primary.License.Extensions, lsk = mergeExtensions(primary.License.Extensions, m.License.Extensions)
|
||||
skipped = append(skipped, lsk...)
|
||||
|
||||
if primary.License.Name == "" {
|
||||
primary.License.Name = m.License.Name
|
||||
}
|
||||
|
||||
if primary.License.URL == "" {
|
||||
primary.License.URL = m.License.URL
|
||||
}
|
||||
}
|
||||
|
||||
return skipped
|
||||
}
|
||||
|
||||
func mergeExtensions(primary spec.Extensions, m spec.Extensions) (result spec.Extensions, skipped []string) {
|
||||
if primary == nil {
|
||||
result = m
|
||||
|
||||
return
|
||||
}
|
||||
|
||||
if m == nil {
|
||||
result = primary
|
||||
|
||||
return
|
||||
}
|
||||
|
||||
result = primary
|
||||
for k, v := range m {
|
||||
if _, found := primary[k]; found {
|
||||
skipped = append(skipped, k)
|
||||
|
||||
continue
|
||||
}
|
||||
|
||||
primary[k] = v
|
||||
}
|
||||
|
||||
return
|
||||
}
|
||||
|
||||
func initPrimary(primary *spec.Swagger) {
|
||||
if primary.SecurityDefinitions == nil {
|
||||
primary.SecurityDefinitions = make(map[string]*spec.SecurityScheme)
|
||||
}
|
||||
|
||||
if primary.Security == nil {
|
||||
primary.Security = make([]map[string][]string, 0, 10)
|
||||
}
|
||||
|
||||
if primary.Produces == nil {
|
||||
primary.Produces = make([]string, 0, 10)
|
||||
}
|
||||
|
||||
if primary.Consumes == nil {
|
||||
primary.Consumes = make([]string, 0, 10)
|
||||
}
|
||||
|
||||
if primary.Tags == nil {
|
||||
primary.Tags = make([]spec.Tag, 0, 10)
|
||||
}
|
||||
|
||||
if primary.Schemes == nil {
|
||||
primary.Schemes = make([]string, 0, 10)
|
||||
}
|
||||
|
||||
if primary.Paths == nil {
|
||||
primary.Paths = &spec.Paths{Paths: make(map[string]spec.PathItem)}
|
||||
}
|
||||
|
||||
if primary.Paths.Paths == nil {
|
||||
primary.Paths.Paths = make(map[string]spec.PathItem)
|
||||
}
|
||||
|
||||
if primary.Definitions == nil {
|
||||
primary.Definitions = make(spec.Definitions)
|
||||
}
|
||||
|
||||
if primary.Parameters == nil {
|
||||
primary.Parameters = make(map[string]spec.Parameter)
|
||||
}
|
||||
|
||||
if primary.Responses == nil {
|
||||
primary.Responses = make(map[string]spec.Response)
|
||||
}
|
||||
}
|
||||
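Mixin is part of the public analysis API, so a caller merges specs by loading them and passing the primary first; colliding entries are skipped and returned as warnings. A hedged usage sketch follows: loading via go-openapi/loads is one common way to obtain *spec.Swagger values, and the file names are hypothetical.

```go
package main

import (
	"fmt"

	"github.com/go-openapi/analysis"
	"github.com/go-openapi/loads"
)

func main() {
	// Hypothetical file names; any two valid Swagger 2.0 documents work.
	primaryDoc, err := loads.Spec("primary.yaml")
	if err != nil {
		panic(err)
	}
	mixinDoc, err := loads.Spec("mixin.yaml")
	if err != nil {
		panic(err)
	}

	// Merge the mixin into the primary spec in place.
	// Each returned string describes one skipped (colliding) entry.
	skipped := analysis.Mixin(primaryDoc.Spec(), mixinDoc.Spec())
	for _, warn := range skipped {
		fmt.Print(warn)
	}
}
```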
256
vendor/github.com/go-openapi/analysis/schema.go
generated
vendored
Normal file
|
|
@ -0,0 +1,256 @@
|
|||
package analysis
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
|
||||
"github.com/go-openapi/spec"
|
||||
"github.com/go-openapi/strfmt"
|
||||
)
|
||||
|
||||
// SchemaOpts configures the schema analyzer
|
||||
type SchemaOpts struct {
|
||||
Schema *spec.Schema
|
||||
Root interface{}
|
||||
BasePath string
|
||||
_ struct{}
|
||||
}
|
||||
|
||||
// Schema analyzes the given schema and classifies it according to known
|
||||
// patterns.
|
||||
func Schema(opts SchemaOpts) (*AnalyzedSchema, error) {
|
||||
if opts.Schema == nil {
|
||||
return nil, fmt.Errorf("no schema to analyze")
|
||||
}
|
||||
|
||||
a := &AnalyzedSchema{
|
||||
schema: opts.Schema,
|
||||
root: opts.Root,
|
||||
basePath: opts.BasePath,
|
||||
}
|
||||
|
||||
a.initializeFlags()
|
||||
a.inferKnownType()
|
||||
a.inferEnum()
|
||||
a.inferBaseType()
|
||||
|
||||
if err := a.inferMap(); err != nil {
|
||||
return nil, err
|
||||
}
|
||||
if err := a.inferArray(); err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
a.inferTuple()
|
||||
|
||||
if err := a.inferFromRef(); err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
a.inferSimpleSchema()
|
||||
|
||||
return a, nil
|
||||
}
|
||||
|
||||
// AnalyzedSchema indicates what the schema represents
|
||||
type AnalyzedSchema struct {
|
||||
schema *spec.Schema
|
||||
root interface{}
|
||||
basePath string
|
||||
|
||||
hasProps bool
|
||||
hasAllOf bool
|
||||
hasItems bool
|
||||
hasAdditionalProps bool
|
||||
hasAdditionalItems bool
|
||||
hasRef bool
|
||||
|
||||
IsKnownType bool
|
||||
IsSimpleSchema bool
|
||||
IsArray bool
|
||||
IsSimpleArray bool
|
||||
IsMap bool
|
||||
IsSimpleMap bool
|
||||
IsExtendedObject bool
|
||||
IsTuple bool
|
||||
IsTupleWithExtra bool
|
||||
IsBaseType bool
|
||||
IsEnum bool
|
||||
}
|
||||
|
||||
// inherits copies value fields from other onto this schema
|
||||
func (a *AnalyzedSchema) inherits(other *AnalyzedSchema) {
|
||||
if other == nil {
|
||||
return
|
||||
}
|
||||
a.hasProps = other.hasProps
|
||||
a.hasAllOf = other.hasAllOf
|
||||
a.hasItems = other.hasItems
|
||||
a.hasAdditionalItems = other.hasAdditionalItems
|
||||
a.hasAdditionalProps = other.hasAdditionalProps
|
||||
a.hasRef = other.hasRef
|
||||
|
||||
a.IsKnownType = other.IsKnownType
|
||||
a.IsSimpleSchema = other.IsSimpleSchema
|
||||
a.IsArray = other.IsArray
|
||||
a.IsSimpleArray = other.IsSimpleArray
|
||||
a.IsMap = other.IsMap
|
||||
a.IsSimpleMap = other.IsSimpleMap
|
||||
a.IsExtendedObject = other.IsExtendedObject
|
||||
a.IsTuple = other.IsTuple
|
||||
a.IsTupleWithExtra = other.IsTupleWithExtra
|
||||
a.IsBaseType = other.IsBaseType
|
||||
a.IsEnum = other.IsEnum
|
||||
}
|
||||
|
||||
func (a *AnalyzedSchema) inferFromRef() error {
|
||||
if a.hasRef {
|
||||
sch := new(spec.Schema)
|
||||
sch.Ref = a.schema.Ref
|
||||
err := spec.ExpandSchema(sch, a.root, nil)
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
rsch, err := Schema(SchemaOpts{
|
||||
Schema: sch,
|
||||
Root: a.root,
|
||||
BasePath: a.basePath,
|
||||
})
|
||||
if err != nil {
|
||||
// NOTE(fredbi): currently the only cause for errors is
|
||||
// unresolved ref. Since spec.ExpandSchema() expands the
|
||||
// schema recursively, there is no chance to get there,
|
||||
// until we add more causes for error in this schema analysis.
|
||||
return err
|
||||
}
|
||||
a.inherits(rsch)
|
||||
}
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
func (a *AnalyzedSchema) inferSimpleSchema() {
|
||||
a.IsSimpleSchema = a.IsKnownType || a.IsSimpleArray || a.IsSimpleMap
|
||||
}
|
||||
|
||||
func (a *AnalyzedSchema) inferKnownType() {
|
||||
tpe := a.schema.Type
|
||||
format := a.schema.Format
|
||||
a.IsKnownType = tpe.Contains("boolean") ||
|
||||
tpe.Contains("integer") ||
|
||||
tpe.Contains("number") ||
|
||||
tpe.Contains("string") ||
|
||||
(format != "" && strfmt.Default.ContainsName(format)) ||
|
||||
(a.isObjectType() && !a.hasProps && !a.hasAllOf && !a.hasAdditionalProps && !a.hasAdditionalItems)
|
||||
}
|
||||
|
||||
func (a *AnalyzedSchema) inferMap() error {
|
||||
if !a.isObjectType() {
|
||||
return nil
|
||||
}
|
||||
|
||||
hasExtra := a.hasProps || a.hasAllOf
|
||||
a.IsMap = a.hasAdditionalProps && !hasExtra
|
||||
a.IsExtendedObject = a.hasAdditionalProps && hasExtra
|
||||
|
||||
if !a.IsMap {
|
||||
return nil
|
||||
}
|
||||
|
||||
// maps
|
||||
if a.schema.AdditionalProperties.Schema != nil {
|
||||
msch, err := Schema(SchemaOpts{
|
||||
Schema: a.schema.AdditionalProperties.Schema,
|
||||
Root: a.root,
|
||||
BasePath: a.basePath,
|
||||
})
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
a.IsSimpleMap = msch.IsSimpleSchema
|
||||
} else if a.schema.AdditionalProperties.Allows {
|
||||
a.IsSimpleMap = true
|
||||
}
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
func (a *AnalyzedSchema) inferArray() error {
|
||||
// an array has Items defined as an object schema, otherwise we qualify this JSON array as a tuple
|
||||
// (yes, even if the Items array contains only one element).
|
||||
// arrays in JSON schema may be unrestricted (i.e. no Items specified).
|
||||
// Note that arrays in Swagger MUST have Items. Nonetheless, we analyze unrestricted arrays.
|
||||
//
|
||||
// NOTE: the spec package misses the distinction between:
|
||||
// items: [] and items: {}, so we consider both arrays here.
|
||||
a.IsArray = a.isArrayType() && (a.schema.Items == nil || a.schema.Items.Schemas == nil)
|
||||
if a.IsArray && a.hasItems {
|
||||
if a.schema.Items.Schema != nil {
|
||||
itsch, err := Schema(SchemaOpts{
|
||||
Schema: a.schema.Items.Schema,
|
||||
Root: a.root,
|
||||
BasePath: a.basePath,
|
||||
})
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
|
||||
a.IsSimpleArray = itsch.IsSimpleSchema
|
||||
}
|
||||
}
|
||||
|
||||
if a.IsArray && !a.hasItems {
|
||||
a.IsSimpleArray = true
|
||||
}
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
func (a *AnalyzedSchema) inferTuple() {
|
||||
tuple := a.hasItems && a.schema.Items.Schemas != nil
|
||||
a.IsTuple = tuple && !a.hasAdditionalItems
|
||||
a.IsTupleWithExtra = tuple && a.hasAdditionalItems
|
||||
}
|
||||
|
||||
func (a *AnalyzedSchema) inferBaseType() {
|
||||
if a.isObjectType() {
|
||||
a.IsBaseType = a.schema.Discriminator != ""
|
||||
}
|
||||
}
|
||||
|
||||
func (a *AnalyzedSchema) inferEnum() {
|
||||
a.IsEnum = len(a.schema.Enum) > 0
|
||||
}
|
||||
|
||||
func (a *AnalyzedSchema) initializeFlags() {
|
||||
a.hasProps = len(a.schema.Properties) > 0
|
||||
a.hasAllOf = len(a.schema.AllOf) > 0
|
||||
a.hasRef = a.schema.Ref.String() != ""
|
||||
|
||||
a.hasItems = a.schema.Items != nil &&
|
||||
(a.schema.Items.Schema != nil || len(a.schema.Items.Schemas) > 0)
|
||||
|
||||
a.hasAdditionalProps = a.schema.AdditionalProperties != nil &&
|
||||
(a.schema.AdditionalProperties.Schema != nil || a.schema.AdditionalProperties.Allows)
|
||||
|
||||
a.hasAdditionalItems = a.schema.AdditionalItems != nil &&
|
||||
(a.schema.AdditionalItems.Schema != nil || a.schema.AdditionalItems.Allows)
|
||||
}
|
||||
|
||||
func (a *AnalyzedSchema) isObjectType() bool {
|
||||
return !a.hasRef && (a.schema.Type == nil || a.schema.Type.Contains("") || a.schema.Type.Contains("object"))
|
||||
}
|
||||
|
||||
func (a *AnalyzedSchema) isArrayType() bool {
|
||||
return !a.hasRef && (a.schema.Type != nil && a.schema.Type.Contains("array"))
|
||||
}
|
||||
|
||||
// isAnalyzedAsComplex determines if an analyzed schema is eligible for flattening (i.e. it is "complex").
|
||||
//
|
||||
// Complex means the schema is not any of:
|
||||
// - a simple type (primitive)
|
||||
// - an array of something (items are possibly complex ; if this is the case, items will generate a definition)
|
||||
// - a map of something (additionalProperties are possibly complex ; if this is the case, additionalProperties will
|
||||
// generate a definition)
|
||||
func (a *AnalyzedSchema) isAnalyzedAsComplex() bool {
|
||||
return !a.IsSimpleSchema && !a.IsArray && !a.IsMap
|
||||
}
|
||||
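Schema is also public: it classifies a single schema into the flags on AnalyzedSchema (known type, map, array, tuple, and so on). A minimal usage sketch, under the assumption that an in-memory schema is enough so Root and BasePath can stay empty:

```go
package main

import (
	"fmt"

	"github.com/go-openapi/analysis"
	"github.com/go-openapi/spec"
)

func main() {
	// A map of strings: type "object" with string additionalProperties.
	sch := spec.MapProperty(spec.StringProperty())

	analyzed, err := analysis.Schema(analysis.SchemaOpts{
		Schema:   sch,
		Root:     nil,
		BasePath: "",
	})
	if err != nil {
		panic(err)
	}

	fmt.Println(analyzed.IsMap, analyzed.IsSimpleMap) // true true
}
```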
1
vendor/github.com/go-openapi/errors/.gitattributes
generated
vendored
Normal file
|
|
@ -0,0 +1 @@
|
|||
*.go text eol=lf
|
||||
2
vendor/github.com/go-openapi/errors/.gitignore
generated
vendored
Normal file
|
|
@ -0,0 +1,2 @@
|
|||
secrets.yml
|
||||
coverage.out
|
||||
48
vendor/github.com/go-openapi/errors/.golangci.yml
generated
vendored
Normal file
|
|
@ -0,0 +1,48 @@
|
|||
linters-settings:
|
||||
govet:
|
||||
check-shadowing: true
|
||||
golint:
|
||||
min-confidence: 0
|
||||
gocyclo:
|
||||
min-complexity: 30
|
||||
maligned:
|
||||
suggest-new: true
|
||||
dupl:
|
||||
threshold: 100
|
||||
goconst:
|
||||
min-len: 2
|
||||
min-occurrences: 4
|
||||
linters:
|
||||
enable-all: true
|
||||
disable:
|
||||
- maligned
|
||||
- lll
|
||||
- gochecknoglobals
|
||||
- godox
|
||||
- gocognit
|
||||
- whitespace
|
||||
- wsl
|
||||
- funlen
|
||||
- gochecknoglobals
|
||||
- gochecknoinits
|
||||
- scopelint
|
||||
- wrapcheck
|
||||
- exhaustivestruct
|
||||
- exhaustive
|
||||
- nlreturn
|
||||
- testpackage
|
||||
- gci
|
||||
- gofumpt
|
||||
- goerr113
|
||||
- gomnd
|
||||
- tparallel
|
||||
- nestif
|
||||
- godot
|
||||
- errorlint
|
||||
- paralleltest
|
||||
- tparallel
|
||||
- cyclop
|
||||
- errname
|
||||
- varnamelen
|
||||
- exhaustruct
|
||||
- maintidx
|
||||
74
vendor/github.com/go-openapi/errors/CODE_OF_CONDUCT.md
generated
vendored
Normal file
|
|
@ -0,0 +1,74 @@
|
|||
# Contributor Covenant Code of Conduct
|
||||
|
||||
## Our Pledge
|
||||
|
||||
In the interest of fostering an open and welcoming environment, we as
|
||||
contributors and maintainers pledge to making participation in our project and
|
||||
our community a harassment-free experience for everyone, regardless of age, body
|
||||
size, disability, ethnicity, gender identity and expression, level of experience,
|
||||
nationality, personal appearance, race, religion, or sexual identity and
|
||||
orientation.
|
||||
|
||||
## Our Standards
|
||||
|
||||
Examples of behavior that contributes to creating a positive environment
|
||||
include:
|
||||
|
||||
* Using welcoming and inclusive language
|
||||
* Being respectful of differing viewpoints and experiences
|
||||
* Gracefully accepting constructive criticism
|
||||
* Focusing on what is best for the community
|
||||
* Showing empathy towards other community members
|
||||
|
||||
Examples of unacceptable behavior by participants include:
|
||||
|
||||
* The use of sexualized language or imagery and unwelcome sexual attention or
|
||||
advances
|
||||
* Trolling, insulting/derogatory comments, and personal or political attacks
|
||||
* Public or private harassment
|
||||
* Publishing others' private information, such as a physical or electronic
|
||||
address, without explicit permission
|
||||
* Other conduct which could reasonably be considered inappropriate in a
|
||||
professional setting
|
||||
|
||||
## Our Responsibilities
|
||||
|
||||
Project maintainers are responsible for clarifying the standards of acceptable
|
||||
behavior and are expected to take appropriate and fair corrective action in
|
||||
response to any instances of unacceptable behavior.
|
||||
|
||||
Project maintainers have the right and responsibility to remove, edit, or
|
||||
reject comments, commits, code, wiki edits, issues, and other contributions
|
||||
that are not aligned to this Code of Conduct, or to ban temporarily or
|
||||
permanently any contributor for other behaviors that they deem inappropriate,
|
||||
threatening, offensive, or harmful.
|
||||
|
||||
## Scope
|
||||
|
||||
This Code of Conduct applies both within project spaces and in public spaces
|
||||
when an individual is representing the project or its community. Examples of
|
||||
representing a project or community include using an official project e-mail
|
||||
address, posting via an official social media account, or acting as an appointed
|
||||
representative at an online or offline event. Representation of a project may be
|
||||
further defined and clarified by project maintainers.
|
||||
|
||||
## Enforcement
|
||||
|
||||
Instances of abusive, harassing, or otherwise unacceptable behavior may be
|
||||
reported by contacting the project team at ivan+abuse@flanders.co.nz. All
|
||||
complaints will be reviewed and investigated and will result in a response that
|
||||
is deemed necessary and appropriate to the circumstances. The project team is
|
||||
obligated to maintain confidentiality with regard to the reporter of an incident.
|
||||
Further details of specific enforcement policies may be posted separately.
|
||||
|
||||
Project maintainers who do not follow or enforce the Code of Conduct in good
|
||||
faith may face temporary or permanent repercussions as determined by other
|
||||
members of the project's leadership.
|
||||
|
||||
## Attribution
|
||||
|
||||
This Code of Conduct is adapted from the [Contributor Covenant][homepage], version 1.4,
|
||||
available at [http://contributor-covenant.org/version/1/4][version]
|
||||
|
||||
[homepage]: http://contributor-covenant.org
|
||||
[version]: http://contributor-covenant.org/version/1/4/
|
||||
202
vendor/github.com/go-openapi/errors/LICENSE
generated
vendored
Normal file
|
|
@ -0,0 +1,202 @@
|
|||
|
||||
Apache License
|
||||
Version 2.0, January 2004
|
||||
http://www.apache.org/licenses/
|
||||
|
||||
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
|
||||
|
||||
1. Definitions.
|
||||
|
||||
"License" shall mean the terms and conditions for use, reproduction,
|
||||
and distribution as defined by Sections 1 through 9 of this document.
|
||||
|
||||
"Licensor" shall mean the copyright owner or entity authorized by
|
||||
the copyright owner that is granting the License.
|
||||
|
||||
"Legal Entity" shall mean the union of the acting entity and all
|
||||
other entities that control, are controlled by, or are under common
|
||||
control with that entity. For the purposes of this definition,
|
||||
"control" means (i) the power, direct or indirect, to cause the
|
||||
direction or management of such entity, whether by contract or
|
||||
otherwise, or (ii) ownership of fifty percent (50%) or more of the
|
||||
outstanding shares, or (iii) beneficial ownership of such entity.
|
||||
|
||||
"You" (or "Your") shall mean an individual or Legal Entity
|
||||
exercising permissions granted by this License.
|
||||
|
||||
"Source" form shall mean the preferred form for making modifications,
|
||||
including but not limited to software source code, documentation
|
||||
source, and configuration files.
|
||||
|
||||
"Object" form shall mean any form resulting from mechanical
|
||||
transformation or translation of a Source form, including but
|
||||
not limited to compiled object code, generated documentation,
|
||||
and conversions to other media types.
|
||||
|
||||
"Work" shall mean the work of authorship, whether in Source or
|
||||
Object form, made available under the License, as indicated by a
|
||||
copyright notice that is included in or attached to the work
|
||||
(an example is provided in the Appendix below).
|
||||
|
||||
"Derivative Works" shall mean any work, whether in Source or Object
|
||||
form, that is based on (or derived from) the Work and for which the
|
||||
editorial revisions, annotations, elaborations, or other modifications
|
||||
represent, as a whole, an original work of authorship. For the purposes
|
||||
of this License, Derivative Works shall not include works that remain
|
||||
separable from, or merely link (or bind by name) to the interfaces of,
|
||||
the Work and Derivative Works thereof.
|
||||
|
||||
"Contribution" shall mean any work of authorship, including
|
||||
the original version of the Work and any modifications or additions
|
||||
to that Work or Derivative Works thereof, that is intentionally
|
||||
submitted to Licensor for inclusion in the Work by the copyright owner
|
||||
or by an individual or Legal Entity authorized to submit on behalf of
|
||||
the copyright owner. For the purposes of this definition, "submitted"
|
||||
means any form of electronic, verbal, or written communication sent
|
||||
to the Licensor or its representatives, including but not limited to
|
||||
communication on electronic mailing lists, source code control systems,
|
||||
and issue tracking systems that are managed by, or on behalf of, the
|
||||
Licensor for the purpose of discussing and improving the Work, but
|
||||
excluding communication that is conspicuously marked or otherwise
|
||||
designated in writing by the copyright owner as "Not a Contribution."
|
||||
|
||||
"Contributor" shall mean Licensor and any individual or Legal Entity
|
||||
on behalf of whom a Contribution has been received by Licensor and
|
||||
subsequently incorporated within the Work.
|
||||
|
||||
2. Grant of Copyright License. Subject to the terms and conditions of
|
||||
this License, each Contributor hereby grants to You a perpetual,
|
||||
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
|
||||
copyright license to reproduce, prepare Derivative Works of,
|
||||
publicly display, publicly perform, sublicense, and distribute the
|
||||
Work and such Derivative Works in Source or Object form.
|
||||
|
||||
3. Grant of Patent License. Subject to the terms and conditions of
|
||||
this License, each Contributor hereby grants to You a perpetual,
|
||||
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
|
||||
(except as stated in this section) patent license to make, have made,
|
||||
use, offer to sell, sell, import, and otherwise transfer the Work,
|
||||
where such license applies only to those patent claims licensable
|
||||
by such Contributor that are necessarily infringed by their
|
||||
Contribution(s) alone or by combination of their Contribution(s)
|
||||
with the Work to which such Contribution(s) was submitted. If You
|
||||
institute patent litigation against any entity (including a
|
||||
cross-claim or counterclaim in a lawsuit) alleging that the Work
|
||||
or a Contribution incorporated within the Work constitutes direct
|
||||
or contributory patent infringement, then any patent licenses
|
||||
granted to You under this License for that Work shall terminate
|
||||
as of the date such litigation is filed.
|
||||
|
||||
4. Redistribution. You may reproduce and distribute copies of the
|
||||
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:

(a) You must give any other recipients of the Work or
    Derivative Works a copy of this License; and

(b) You must cause any modified files to carry prominent notices
    stating that You changed the files; and

(c) You must retain, in the Source form of any Derivative Works
    that You distribute, all copyright, patent, trademark, and
    attribution notices from the Source form of the Work,
    excluding those notices that do not pertain to any part of
    the Derivative Works; and

(d) If the Work includes a "NOTICE" text file as part of its
    distribution, then any Derivative Works that You distribute must
    include a readable copy of the attribution notices contained
    within such NOTICE file, excluding those notices that do not
    pertain to any part of the Derivative Works, in at least one
    of the following places: within a NOTICE text file distributed
    as part of the Derivative Works; within the Source form or
    documentation, if provided along with the Derivative Works; or,
    within a display generated by the Derivative Works, if and
    wherever such third-party notices normally appear. The contents
    of the NOTICE file are for informational purposes only and
    do not modify the License. You may add Your own attribution
    notices within Derivative Works that You distribute, alongside
    or as an addendum to the NOTICE text from the Work, provided
    that such additional attribution notices cannot be construed
    as modifying the License.

You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.

5. Submission of Contributions. Unless You explicitly state otherwise,
   any Contribution intentionally submitted for inclusion in the Work
   by You to the Licensor shall be under the terms and conditions of
   this License, without any additional terms or conditions.
   Notwithstanding the above, nothing herein shall supersede or modify
   the terms of any separate license agreement you may have executed
   with Licensor regarding such Contributions.

6. Trademarks. This License does not grant permission to use the trade
   names, trademarks, service marks, or product names of the Licensor,
   except as required for reasonable and customary use in describing the
   origin of the Work and reproducing the content of the NOTICE file.

7. Disclaimer of Warranty. Unless required by applicable law or
   agreed to in writing, Licensor provides the Work (and each
   Contributor provides its Contributions) on an "AS IS" BASIS,
   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
   implied, including, without limitation, any warranties or conditions
   of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
   PARTICULAR PURPOSE. You are solely responsible for determining the
   appropriateness of using or redistributing the Work and assume any
   risks associated with Your exercise of permissions under this License.

8. Limitation of Liability. In no event and under no legal theory,
   whether in tort (including negligence), contract, or otherwise,
   unless required by applicable law (such as deliberate and grossly
   negligent acts) or agreed to in writing, shall any Contributor be
   liable to You for damages, including any direct, indirect, special,
   incidental, or consequential damages of any character arising as a
   result of this License or out of the use or inability to use the
   Work (including but not limited to damages for loss of goodwill,
   work stoppage, computer failure or malfunction, or any and all
   other commercial damages or losses), even if such Contributor
   has been advised of the possibility of such damages.

9. Accepting Warranty or Additional Liability. While redistributing
   the Work or Derivative Works thereof, You may choose to offer,
   and charge a fee for, acceptance of support, warranty, indemnity,
   or other liability obligations and/or rights consistent with this
   License. However, in accepting such obligations, You may act only
   on Your own behalf and on Your sole responsibility, not on behalf
   of any other Contributor, and only if You agree to indemnify,
   defend, and hold each Contributor harmless for any liability
   incurred by, or claims asserted against, such Contributor by reason
   of your accepting any such warranty or additional liability.

END OF TERMS AND CONDITIONS

APPENDIX: How to apply the Apache License to your work.

To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "[]"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.

Copyright [yyyy] [name of copyright owner]

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
11
vendor/github.com/go-openapi/errors/README.md
generated
vendored
Normal file
@@ -0,0 +1,11 @@
# OpenAPI errors

[Build Status](https://travis-ci.org/go-openapi/errors)
[Coverage](https://codecov.io/gh/go-openapi/errors)
[Slack](https://slackin.goswagger.io)
[License](https://raw.githubusercontent.com/go-openapi/errors/master/LICENSE)
[Go Reference](https://pkg.go.dev/github.com/go-openapi/errors)
[GolangCI](https://golangci.com)
[Go Report Card](https://goreportcard.com/report/github.com/go-openapi/errors)

Shared errors and error interface used throughout the various libraries found in the go-openapi toolkit.
182
vendor/github.com/go-openapi/errors/api.go
generated
vendored
Normal file
@@ -0,0 +1,182 @@
// Copyright 2015 go-swagger maintainers
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
//    http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

package errors

import (
    "encoding/json"
    "fmt"
    "net/http"
    "reflect"
    "strings"
)

// DefaultHTTPCode is used when the error Code cannot be used as an HTTP code.
var DefaultHTTPCode = http.StatusUnprocessableEntity

// Error represents a error interface all swagger framework errors implement
type Error interface {
    error
    Code() int32
}

type apiError struct {
    code    int32
    message string
}

func (a *apiError) Error() string {
    return a.message
}

func (a *apiError) Code() int32 {
    return a.code
}

// MarshalJSON implements the JSON encoding interface
func (a apiError) MarshalJSON() ([]byte, error) {
    return json.Marshal(map[string]interface{}{
        "code":    a.code,
        "message": a.message,
    })
}

// New creates a new API error with a code and a message
func New(code int32, message string, args ...interface{}) Error {
    if len(args) > 0 {
        return &apiError{code, fmt.Sprintf(message, args...)}
    }
    return &apiError{code, message}
}

// NotFound creates a new not found error
func NotFound(message string, args ...interface{}) Error {
    if message == "" {
        message = "Not found"
    }
    return New(http.StatusNotFound, fmt.Sprintf(message, args...))
}

// NotImplemented creates a new not implemented error
func NotImplemented(message string) Error {
    return New(http.StatusNotImplemented, message)
}

// MethodNotAllowedError represents an error for when the path matches but the method doesn't
type MethodNotAllowedError struct {
    code    int32
    Allowed []string
    message string
}

func (m *MethodNotAllowedError) Error() string {
    return m.message
}

// Code the error code
func (m *MethodNotAllowedError) Code() int32 {
    return m.code
}

// MarshalJSON implements the JSON encoding interface
func (m MethodNotAllowedError) MarshalJSON() ([]byte, error) {
    return json.Marshal(map[string]interface{}{
        "code":    m.code,
        "message": m.message,
        "allowed": m.Allowed,
    })
}

func errorAsJSON(err Error) []byte {
    //nolint:errchkjson
    b, _ := json.Marshal(struct {
        Code    int32  `json:"code"`
        Message string `json:"message"`
    }{err.Code(), err.Error()})
    return b
}

func flattenComposite(errs *CompositeError) *CompositeError {
    var res []error
    for _, er := range errs.Errors {
        switch e := er.(type) {
        case *CompositeError:
            if e != nil && len(e.Errors) > 0 {
                flat := flattenComposite(e)
                if len(flat.Errors) > 0 {
                    res = append(res, flat.Errors...)
                }
            }
        default:
            if e != nil {
                res = append(res, e)
            }
        }
    }
    return CompositeValidationError(res...)
}

// MethodNotAllowed creates a new method not allowed error
func MethodNotAllowed(requested string, allow []string) Error {
    msg := fmt.Sprintf("method %s is not allowed, but [%s] are", requested, strings.Join(allow, ","))
    return &MethodNotAllowedError{code: http.StatusMethodNotAllowed, Allowed: allow, message: msg}
}

// ServeError the error handler interface implementation
func ServeError(rw http.ResponseWriter, r *http.Request, err error) {
    rw.Header().Set("Content-Type", "application/json")
    switch e := err.(type) {
    case *CompositeError:
        er := flattenComposite(e)
        // strips composite errors to first element only
        if len(er.Errors) > 0 {
            ServeError(rw, r, er.Errors[0])
        } else {
            // guard against empty CompositeError (invalid construct)
            ServeError(rw, r, nil)
        }
    case *MethodNotAllowedError:
        rw.Header().Add("Allow", strings.Join(e.Allowed, ","))
        rw.WriteHeader(asHTTPCode(int(e.Code())))
        if r == nil || r.Method != http.MethodHead {
            _, _ = rw.Write(errorAsJSON(e))
        }
    case Error:
        value := reflect.ValueOf(e)
        if value.Kind() == reflect.Ptr && value.IsNil() {
            rw.WriteHeader(http.StatusInternalServerError)
            _, _ = rw.Write(errorAsJSON(New(http.StatusInternalServerError, "Unknown error")))
            return
        }
        rw.WriteHeader(asHTTPCode(int(e.Code())))
        if r == nil || r.Method != http.MethodHead {
            _, _ = rw.Write(errorAsJSON(e))
        }
    case nil:
        rw.WriteHeader(http.StatusInternalServerError)
        _, _ = rw.Write(errorAsJSON(New(http.StatusInternalServerError, "Unknown error")))
    default:
        rw.WriteHeader(http.StatusInternalServerError)
        if r == nil || r.Method != http.MethodHead {
            _, _ = rw.Write(errorAsJSON(New(http.StatusInternalServerError, err.Error())))
        }
    }
}

func asHTTPCode(input int) int {
    if input >= 600 {
        return DefaultHTTPCode
    }
    return input
}
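
For orientation, here is a minimal sketch of how a handler might combine the constructors and ServeError from the vendored api.go above. The /widgets route, the lookup stub, and the listen address are assumptions invented for this example; they are not part of the vendored package.

    package main

    import (
        "net/http"

        "github.com/go-openapi/errors"
    )

    func main() {
        http.HandleFunc("/widgets/", func(rw http.ResponseWriter, r *http.Request) {
            // Stand-in lookup: pretend the widget was not found.
            found := false
            if !found {
                // NotFound builds an errors.Error with code 404; ServeError then
                // writes {"code":404,"message":...} with the matching HTTP status.
                errors.ServeError(rw, r, errors.NotFound("no widget at %s", r.URL.Path))
                return
            }
            rw.WriteHeader(http.StatusNoContent)
        })
        _ = http.ListenAndServe(":8080", nil) // assumed address for the sketch
    }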
22
vendor/github.com/go-openapi/errors/auth.go
generated
vendored
Normal file
@@ -0,0 +1,22 @@
// Copyright 2015 go-swagger maintainers
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
//    http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

package errors

import "net/http"

// Unauthenticated returns an unauthenticated error
func Unauthenticated(scheme string) Error {
    return New(http.StatusUnauthorized, "unauthenticated for %s", scheme)
}
26
vendor/github.com/go-openapi/errors/doc.go
generated
vendored
Normal file
@@ -0,0 +1,26 @@
// Copyright 2015 go-swagger maintainers
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
//    http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

/*
Package errors provides an Error interface and several concrete types
implementing this interface to manage API errors and JSON-schema validation
errors.

A middleware handler ServeError() is provided to serve the errors types
it defines.

It is used throughout the various go-openapi toolkit libraries
(https://github.com/go-openapi).
*/
package errors
103
vendor/github.com/go-openapi/errors/headers.go
generated
vendored
Normal file
@@ -0,0 +1,103 @@
// Copyright 2015 go-swagger maintainers
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
//    http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

package errors

import (
    "encoding/json"
    "fmt"
    "net/http"
)

// Validation represents a failure of a precondition
type Validation struct {
    code    int32
    Name    string
    In      string
    Value   interface{}
    message string
    Values  []interface{}
}

func (e *Validation) Error() string {
    return e.message
}

// Code the error code
func (e *Validation) Code() int32 {
    return e.code
}

// MarshalJSON implements the JSON encoding interface
func (e Validation) MarshalJSON() ([]byte, error) {
    return json.Marshal(map[string]interface{}{
        "code":    e.code,
        "message": e.message,
        "in":      e.In,
        "name":    e.Name,
        "value":   e.Value,
        "values":  e.Values,
    })
}

// ValidateName sets the name for a validation or updates it for a nested property
func (e *Validation) ValidateName(name string) *Validation {
    if name != "" {
        if e.Name == "" {
            e.Name = name
            e.message = name + e.message
        } else {
            e.Name = name + "." + e.Name
            e.message = name + "." + e.message
        }
    }
    return e
}

const (
    contentTypeFail    = `unsupported media type %q, only %v are allowed`
    responseFormatFail = `unsupported media type requested, only %v are available`
)

// InvalidContentType error for an invalid content type
func InvalidContentType(value string, allowed []string) *Validation {
    values := make([]interface{}, 0, len(allowed))
    for _, v := range allowed {
        values = append(values, v)
    }
    return &Validation{
        code:    http.StatusUnsupportedMediaType,
        Name:    "Content-Type",
        In:      "header",
        Value:   value,
        Values:  values,
        message: fmt.Sprintf(contentTypeFail, value, allowed),
    }
}

// InvalidResponseFormat error for an unacceptable response format request
func InvalidResponseFormat(value string, allowed []string) *Validation {
    values := make([]interface{}, 0, len(allowed))
    for _, v := range allowed {
        values = append(values, v)
    }
    return &Validation{
        code:    http.StatusNotAcceptable,
        Name:    "Accept",
        In:      "header",
        Value:   value,
        Values:  values,
        message: fmt.Sprintf(responseFormatFail, allowed),
    }
}
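
A quick illustration of how InvalidContentType from headers.go pairs with ServeError from api.go. The handler, the allowed list, and the route below are assumptions for this sketch only, not part of the vendored code.

    package main

    import (
        "net/http"

        "github.com/go-openapi/errors"
    )

    // allowed is an assumed whitelist for this example.
    var allowed = []string{"application/json"}

    func createItem(rw http.ResponseWriter, r *http.Request) {
        if ct := r.Header.Get("Content-Type"); ct != "application/json" {
            // InvalidContentType returns a *Validation carrying code 415, the offending
            // value, and the allowed media types; ServeError renders it as JSON with a
            // 415 status, including the name/in/value/values fields from MarshalJSON.
            errors.ServeError(rw, r, errors.InvalidContentType(ct, allowed))
            return
        }
        rw.WriteHeader(http.StatusAccepted)
    }

    func main() {
        http.HandleFunc("/items", createItem)
        _ = http.ListenAndServe(":8080", nil) // assumed address for the sketch
    }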
50
vendor/github.com/go-openapi/errors/middleware.go
generated
vendored
Normal file
@@ -0,0 +1,50 @@
// Copyright 2015 go-swagger maintainers
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
//    http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

package errors

import (
    "bytes"
    "fmt"
    "strings"
)

// APIVerificationFailed is an error that contains all the missing info for a mismatched section
// between the api registrations and the api spec
type APIVerificationFailed struct {
    Section              string   `json:"section,omitempty"`
    MissingSpecification []string `json:"missingSpecification,omitempty"`
    MissingRegistration  []string `json:"missingRegistration,omitempty"`
}

func (v *APIVerificationFailed) Error() string {
    buf := bytes.NewBuffer(nil)

    hasRegMissing := len(v.MissingRegistration) > 0
    hasSpecMissing := len(v.MissingSpecification) > 0

    if hasRegMissing {
        buf.WriteString(fmt.Sprintf("missing [%s] %s registrations", strings.Join(v.MissingRegistration, ", "), v.Section))
    }

    if hasRegMissing && hasSpecMissing {
        buf.WriteString("\n")
    }

    if hasSpecMissing {
        buf.WriteString(fmt.Sprintf("missing from spec file [%s] %s", strings.Join(v.MissingSpecification, ", "), v.Section))
    }

    return buf.String()
}
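
To make the Error() output concrete, here is a small sketch with made-up operation names; the two lines quoted in the comment follow directly from the formatting code in middleware.go above.

    package main

    import (
        "fmt"

        "github.com/go-openapi/errors"
    )

    func main() {
        // Made-up mismatch between registered handlers and the spec.
        err := &errors.APIVerificationFailed{
            Section:              "operation",
            MissingRegistration:  []string{"getPet", "listPets"},
            MissingSpecification: []string{"deletePet"},
        }
        // Prints two lines:
        //   missing [getPet, listPets] operation registrations
        //   missing from spec file [deletePet] operation
        fmt.Println(err.Error())
    }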
Some files were not shown because too many files have changed in this diff.